Mar 19 18:56:13 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 18:56:14 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 18:56:14 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 18:56:14 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:14 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 18:56:15 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 18:56:15 crc kubenswrapper[4826]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.689355 4826 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695224 4826 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695270 4826 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695279 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695289 4826 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695301 4826 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695311 4826 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695320 4826 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695329 4826 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695337 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695345 4826 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695352 4826 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695360 4826 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695370 4826 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695381 4826 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695389 4826 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695397 4826 feature_gate.go:330] unrecognized feature gate: Example Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695406 4826 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695414 4826 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695424 4826 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695444 4826 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695453 4826 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695462 4826 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695470 4826 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695480 4826 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695520 4826 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695552 4826 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695589 4826 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695598 4826 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695607 4826 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695616 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695624 4826 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695631 4826 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695639 4826 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695647 4826 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695682 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695692 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695702 4826 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695710 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695718 4826 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695727 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695735 4826 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695742 4826 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695751 4826 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695758 4826 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695766 4826 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695774 4826 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695782 4826 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695790 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695797 4826 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695805 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695813 4826 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695820 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695828 4826 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695836 4826 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695844 4826 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695853 4826 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695861 4826 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695870 4826 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695882 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695891 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695901 4826 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695914 4826 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695924 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695934 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695944 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695954 4826 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695964 4826 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695976 4826 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695985 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.695993 4826 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.696001 4826 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.696981 4826 flags.go:64] FLAG: --address="0.0.0.0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697007 4826 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697038 4826 flags.go:64] FLAG: --anonymous-auth="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697071 4826 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697086 4826 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697098 4826 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697116 4826 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697130 4826 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697143 4826 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697155 4826 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697168 4826 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697180 4826 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697190 4826 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697199 4826 flags.go:64] FLAG: --cgroup-root=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697208 4826 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697219 4826 flags.go:64] FLAG: --client-ca-file=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697229 4826 flags.go:64] FLAG: --cloud-config=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697239 4826 flags.go:64] FLAG: --cloud-provider=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697248 4826 flags.go:64] FLAG: --cluster-dns="[]"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697259 4826 flags.go:64] FLAG: --cluster-domain=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697268 4826 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697278 4826 flags.go:64] FLAG: --config-dir=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697287 4826 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697297 4826 flags.go:64] FLAG: --container-log-max-files="5"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697309 4826 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697318 4826 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697328 4826 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697337 4826 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697346 4826 flags.go:64] FLAG: --contention-profiling="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697355 4826 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697365 4826 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697375 4826 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697384 4826 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697396 4826 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697405 4826 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697414 4826 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697424 4826 flags.go:64] FLAG: --enable-load-reader="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697434 4826 flags.go:64] FLAG: --enable-server="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697444 4826 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697456 4826 flags.go:64] FLAG: --event-burst="100"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697466 4826 flags.go:64] FLAG: --event-qps="50"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697476 4826 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697485 4826 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697495 4826 flags.go:64] FLAG: --eviction-hard=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697506 4826 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697515 4826 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697524 4826 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697534 4826 flags.go:64] FLAG: --eviction-soft=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697543 4826 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697554 4826 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697563 4826 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697572 4826 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697582 4826 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697592 4826 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697602 4826 flags.go:64] FLAG: --feature-gates=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697612 4826 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697622 4826 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697632 4826 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697642 4826 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697684 4826 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697694 4826 flags.go:64] FLAG: --help="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697704 4826 flags.go:64] FLAG: --hostname-override=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697712 4826 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697722 4826 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697732 4826 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697741 4826 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697750 4826 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697759 4826 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697768 4826 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697777 4826 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697788 4826 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697797 4826 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697807 4826 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697817 4826 flags.go:64] FLAG: --kube-reserved=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697827 4826 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697835 4826 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697845 4826 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697854 4826 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697863 4826 flags.go:64] FLAG: --lock-file=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697872 4826 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697881 4826 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697891 4826 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697905 4826 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697915 4826 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697924 4826 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697933 4826 flags.go:64] FLAG: --logging-format="text"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697942 4826 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697953 4826 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697962 4826 flags.go:64] FLAG: --manifest-url=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697973 4826 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697985 4826 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.697994 4826 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698005 4826 flags.go:64] FLAG: --max-pods="110"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698014 4826 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698024 4826 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698033 4826 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698042 4826 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698052 4826 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698061 4826 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698070 4826 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698094 4826 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698103 4826 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698114 4826 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698123 4826 flags.go:64] FLAG: --pod-cidr=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698132 4826 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698146 4826 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698155 4826 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698164 4826 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698174 4826 flags.go:64] FLAG: --port="10250"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698186 4826 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698197 4826 flags.go:64] FLAG: --provider-id=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698209 4826 flags.go:64] FLAG: --qos-reserved=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698220 4826 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698232 4826 flags.go:64] FLAG: --register-node="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698244 4826 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698256 4826 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698276 4826 flags.go:64] FLAG: --registry-burst="10"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698287 4826 flags.go:64] FLAG: --registry-qps="5"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698298 4826 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698309 4826 flags.go:64] FLAG: --reserved-memory=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698324 4826 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698336 4826 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698348 4826 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698360 4826 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698370 4826 flags.go:64] FLAG: --runonce="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698380 4826 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698391 4826 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698401 4826 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698410 4826 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698419 4826 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698431 4826 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698443 4826 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698455 4826 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698466 4826 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698478 4826 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698490 4826 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698500 4826 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698512 4826 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698523 4826 flags.go:64] FLAG: --system-cgroups=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698535 4826 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698555 4826 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698565 4826 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698574 4826 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698585 4826 flags.go:64] FLAG: --tls-min-version=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698595 4826 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698605 4826 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698615 4826 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698625 4826 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698634 4826 flags.go:64] FLAG: --v="2"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698646 4826 flags.go:64] FLAG: --version="false"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698688 4826 flags.go:64] FLAG: --vmodule=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698700 4826 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.698709 4826 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698940 4826 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698951 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698960 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698969 4826 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698978 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698986 4826 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.698995 4826 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699003 4826 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699013 4826 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699024 4826 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699033 4826 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699042 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699050 4826 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699058 4826 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699067 4826 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699074 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699083 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699090 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699098 4826 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699107 4826 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699115 4826 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699122 4826 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699130 4826 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699138 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699146 4826 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699155 4826 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699166 4826 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.699292 4826 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.702873 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.702919 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.702953 4826 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703006 4826 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703015 4826 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703359 4826 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703381 4826 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703392 4826 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703401 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703409 4826 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703418 4826 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703427 4826 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703435 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703443 4826 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703456 4826 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703471 4826 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703481 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703490 4826 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703525 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703535 4826 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703542 4826 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703551 4826 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703560 4826 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703568 4826 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703577 4826 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703589 4826 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703600 4826 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703609 4826 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703618 4826 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703627 4826 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703635 4826 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703643 4826 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703650 4826 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703685 4826 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703693 4826 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703701 4826 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703709 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703717 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703725 4826 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703733 4826 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703744 4826 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703753 4826 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.703770 4826 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.703788 4826 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.715829 4826 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.715919 4826 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716085 4826 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716119 4826 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716132 4826 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716144 4826 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716155 4826 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716171 4826 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716188 4826 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716202 4826 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716213 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716225 4826 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716237 4826 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716247 4826 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716258 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716269 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716279 4826 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716290 4826 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716301 4826 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716312 4826 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716324 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716335 4826 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 18:56:15 crc kubenswrapper[4826]:
W0319 18:56:15.716346 4826 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716357 4826 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716367 4826 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716376 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716387 4826 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716397 4826 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716407 4826 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716417 4826 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716427 4826 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716436 4826 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716446 4826 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716456 4826 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716469 4826 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716479 4826 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716490 4826 feature_gate.go:330] unrecognized feature gate: 
MixedCPUsAllocation Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716500 4826 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716512 4826 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716523 4826 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716533 4826 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716543 4826 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716554 4826 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716565 4826 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716575 4826 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716585 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716595 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716606 4826 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716617 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716628 4826 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716638 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 18:56:15 crc kubenswrapper[4826]: 
W0319 18:56:15.716648 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716731 4826 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716747 4826 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716759 4826 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716770 4826 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716780 4826 feature_gate.go:330] unrecognized feature gate: Example Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716790 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716800 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716811 4826 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716821 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716831 4826 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716844 4826 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716857 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716871 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716884 4826 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716898 4826 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716908 4826 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716921 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716932 4826 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716944 4826 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716955 4826 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.716967 4826 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.716985 4826 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 18:56:15 crc kubenswrapper[4826]: 
W0319 18:56:15.717468 4826 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717511 4826 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717526 4826 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717540 4826 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717553 4826 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717565 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717576 4826 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717587 4826 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717597 4826 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717608 4826 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717617 4826 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717627 4826 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717640 4826 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717651 4826 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717697 4826 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717713 4826 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717727 4826 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717740 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717750 4826 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717761 4826 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717773 4826 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717783 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717793 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717804 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717815 4826 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717827 4826 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717839 4826 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717849 4826 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 
18:56:15.717860 4826 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717870 4826 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717905 4826 feature_gate.go:330] unrecognized feature gate: Example Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717916 4826 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717927 4826 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717937 4826 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717949 4826 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717960 4826 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717972 4826 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717984 4826 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.717995 4826 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718006 4826 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718017 4826 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718034 4826 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718048 4826 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718060 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718072 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718083 4826 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718093 4826 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718104 4826 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718114 4826 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718124 4826 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718135 4826 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718145 4826 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718156 4826 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718167 4826 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718178 4826 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718188 4826 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 18:56:15 crc 
kubenswrapper[4826]: W0319 18:56:15.718199 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718208 4826 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718218 4826 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718234 4826 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718247 4826 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718257 4826 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718268 4826 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718280 4826 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718292 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718302 4826 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718313 4826 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718324 4826 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718334 4826 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718345 4826 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.718356 4826 feature_gate.go:330] 
unrecognized feature gate: SigstoreImageVerification Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.718373 4826 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.718812 4826 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.723900 4826 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.728710 4826 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.729062 4826 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.730731 4826 server.go:997] "Starting client certificate rotation" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.730780 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.731702 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.759017 4826 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.763496 4826 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.764008 4826 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.783706 4826 log.go:25] "Validated CRI v1 runtime API" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.824078 4826 log.go:25] "Validated CRI v1 image API" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.828019 4826 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.835703 4826 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-18-51-44-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.835768 4826 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.871781 4826 manager.go:217] Machine: {Timestamp:2026-03-19 18:56:15.867503294 +0000 UTC m=+0.621571697 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e8d792fa-f63b-43a0-bb9a-92bfd45c6f50 BootID:188a80e8-02e6-4be0-9b87-2d80617c6be2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:c5:43:8e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c5:43:8e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:40:f0:86 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:56:63:f5 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:fd:ec:0e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:21:d0:36 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:c4:ba:f6:9f:1c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:fa:47:8d:cf:b1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.872274 4826 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.872549 4826 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.874221 4826 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.874613 4826 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.874717 4826 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.875099 4826 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.875120 4826 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.875804 4826 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.875865 4826 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.876796 4826 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.876954 4826 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.880556 4826 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.880600 4826 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.880627 4826 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.880677 4826 kubelet.go:324] "Adding apiserver pod source"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.880701 4826 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.885539 4826 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.886806 4826 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.888229 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.888382 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.888395 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.888559 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.888318 4826 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890234 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890276 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890291 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890308 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890331 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890344 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890358 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890381 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890398 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890412 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890452 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.890467 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.891764 4826 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.892736 4826 server.go:1280] "Started kubelet"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.893800 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.894687 4826 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.895333 4826 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.895494 4826 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 18:56:15 crc systemd[1]: Started Kubernetes Kubelet.
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.897298 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.898927 4826 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.901363 4826 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.903349 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.901310 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.904205 4826 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.904255 4826 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.905600 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.905784 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.906967 4826 factory.go:55] Registering systemd factory
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.907010 4826 factory.go:221] Registration of the systemd container factory successfully
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.907289 4826 server.go:460] "Adding debug handlers to kubelet server"
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.907031 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.908548 4826 factory.go:153] Registering CRI-O factory
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.908833 4826 factory.go:221] Registration of the crio container factory successfully
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.909063 4826 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.909236 4826 factory.go:103] Registering Raw factory
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.909360 4826 manager.go:1196] Started watching for new ooms in manager
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.912149 4826 manager.go:319] Starting recovery of all containers
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925562 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925729 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925762 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925787 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925808 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925832 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925854 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925921 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925951 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.925973 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926052 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926083 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926109 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926138 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926162 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926188 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926221 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926247 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926273 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926296 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926319 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926358 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926383 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926406 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926428 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926453 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926489 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926516 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926539 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926562 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926587 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926612 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926640 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926696 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926763 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926792 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926821 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926848 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926873 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926897 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926922 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926948 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.926978 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927004 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927032 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927057 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927085 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927111 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927139 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927164 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927189 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927215 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927249 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927276 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927303 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927329 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927360 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927386 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927409 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927436 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927461 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927484 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927507 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927534 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927560 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927587 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927610 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927634 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927690 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927718 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927769 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927794 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927819 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927841 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927868 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927894 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927918 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.927986 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928013 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928041 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928069 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928095 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928138 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.928172 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 19 18:56:15 crc
kubenswrapper[4826]: I0319 18:56:15.931629 4826 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931755 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931786 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931807 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931837 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931857 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 
18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931878 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931897 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.931965 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932362 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932392 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932411 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932434 4826 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932454 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932474 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932492 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932511 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932539 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932561 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932582 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932601 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932638 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932681 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932706 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932726 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932746 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932768 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932789 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932811 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932832 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932853 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" 
seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932874 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932894 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932921 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932944 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932964 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.932986 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 
18:56:15.933005 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933022 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933078 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933106 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933125 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933144 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933164 4826 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933183 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933201 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933220 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933240 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933257 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933275 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933293 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933313 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933333 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933351 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933370 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933390 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933408 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933428 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933449 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933468 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933488 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933509 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933528 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933546 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933566 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933587 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933608 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933626 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: 
I0319 18:56:15.933644 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933693 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933713 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933735 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933757 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933775 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933797 4826 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933816 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933870 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933890 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933908 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933927 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933965 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.933991 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934012 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934030 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934052 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934073 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934093 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934111 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934138 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934157 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934175 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934193 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934213 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934233 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934251 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934270 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934290 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934308 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934327 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934360 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934380 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934434 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934454 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934473 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934494 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934513 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934532 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934551 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934572 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934592 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934612 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934633 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934693 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934713 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934731 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934750 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934769 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934788 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934815 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934848 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934866 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934884 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934903 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934920 4826 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934954 4826 manager.go:324] Recovery completed
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.934940 4826 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.937118 4826 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.951187 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.952880 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.952954 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.952974 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.957151 4826 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.957192 4826 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.957229 4826 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.971562 4826 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.974492 4826 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.974615 4826 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.974762 4826 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.974909 4826 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 19 18:56:15 crc kubenswrapper[4826]: W0319 18:56:15.980292 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:15 crc kubenswrapper[4826]: E0319 18:56:15.980465 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.984618 4826 policy_none.go:49] "None policy: Start"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.985804 4826 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 18:56:15 crc kubenswrapper[4826]: I0319 18:56:15.985849 4826 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.004404 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.042827 4826 manager.go:334] "Starting Device Plugin manager"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.042886 4826 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.042905 4826 server.go:79] "Starting device plugin registration server"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.044840 4826 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.044907 4826 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.045118 4826 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.045293 4826 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.045309 4826 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.058224 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.075458 4826 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.075607 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.077573 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.077827 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.077961 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.078330 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.078618 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.078706 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080084 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080119 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080131 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080384 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080527 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.080694 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.081023 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.081292 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.081438 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.082501 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.082540 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.082552 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.082705 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.082949 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.083019 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084469 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084499 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084500 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084549 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084569 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084511 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084575 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084793 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084798 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084806 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.084962 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.085015 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086395 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086453 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086493 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086454 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086578 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086605 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.086986 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.087053 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.088678 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.088732 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.088758 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.105179 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms"
Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.118201 4826 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective": open /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/cpuset.cpus.effective: no such device
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139318 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139359 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139397 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139434 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139467 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139507 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139542 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139614 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139723 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139759 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139789 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139819 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139851 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.139881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.145641 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.146946 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.147017 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.147050 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.147093 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.147765 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241089 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241200 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241244 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241276 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241322 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241362 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241569 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241604 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241640 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241700 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241736 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241771 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241896 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.241963 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242762 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242837 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242919 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242924 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242950 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242971 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243032 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243045 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243065 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.242836 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243093 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243131 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.243047 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.348365 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.349997 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.350065 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:16 crc kubenswrapper[4826]: 
I0319 18:56:16.350079 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.350113 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.350583 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.413852 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.429119 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.439462 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.453914 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.463755 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.467973 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bd98899e360597afda6a95d04b193c09528a21502baf8bbaa16ee6d3ff7cc0cb WatchSource:0}: Error finding container bd98899e360597afda6a95d04b193c09528a21502baf8bbaa16ee6d3ff7cc0cb: Status 404 returned error can't find the container with id bd98899e360597afda6a95d04b193c09528a21502baf8bbaa16ee6d3ff7cc0cb Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.479459 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1bde394351c04172c965096cef25172cd52de06aed58d738f6a8c6e2f6b487da WatchSource:0}: Error finding container 1bde394351c04172c965096cef25172cd52de06aed58d738f6a8c6e2f6b487da: Status 404 returned error can't find the container with id 1bde394351c04172c965096cef25172cd52de06aed58d738f6a8c6e2f6b487da Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.483782 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-93ecaa4cc7ae28ae61fc56abca58453af93edf08b5fce143343aee6c8ced4227 WatchSource:0}: Error finding container 93ecaa4cc7ae28ae61fc56abca58453af93edf08b5fce143343aee6c8ced4227: Status 404 returned error can't find the container with id 93ecaa4cc7ae28ae61fc56abca58453af93edf08b5fce143343aee6c8ced4227 Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.487026 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2548c092a5b6e983ef0bebbe4686da5dcb09a8d7867c8a6993f46ea4393ed566 
WatchSource:0}: Error finding container 2548c092a5b6e983ef0bebbe4686da5dcb09a8d7867c8a6993f46ea4393ed566: Status 404 returned error can't find the container with id 2548c092a5b6e983ef0bebbe4686da5dcb09a8d7867c8a6993f46ea4393ed566 Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.492715 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0f3555c7b8bb1e8a5ad2f7debdb3805a388b4ca873d3e3fd29b07a975f34ee60 WatchSource:0}: Error finding container 0f3555c7b8bb1e8a5ad2f7debdb3805a388b4ca873d3e3fd29b07a975f34ee60: Status 404 returned error can't find the container with id 0f3555c7b8bb1e8a5ad2f7debdb3805a388b4ca873d3e3fd29b07a975f34ee60 Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.507054 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.718923 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.719075 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.751563 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 
18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.753123 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.753186 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.753205 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.753251 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.753891 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Mar 19 18:56:16 crc kubenswrapper[4826]: W0319 18:56:16.783100 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:16 crc kubenswrapper[4826]: E0319 18:56:16.783219 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.895174 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.981837 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2548c092a5b6e983ef0bebbe4686da5dcb09a8d7867c8a6993f46ea4393ed566"} Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.983043 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93ecaa4cc7ae28ae61fc56abca58453af93edf08b5fce143343aee6c8ced4227"} Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.984679 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1bde394351c04172c965096cef25172cd52de06aed58d738f6a8c6e2f6b487da"} Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.985831 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bd98899e360597afda6a95d04b193c09528a21502baf8bbaa16ee6d3ff7cc0cb"} Mar 19 18:56:16 crc kubenswrapper[4826]: I0319 18:56:16.986829 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f3555c7b8bb1e8a5ad2f7debdb3805a388b4ca873d3e3fd29b07a975f34ee60"} Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.308324 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Mar 19 18:56:17 crc kubenswrapper[4826]: W0319 18:56:17.326455 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.326606 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:17 crc kubenswrapper[4826]: W0319 18:56:17.348824 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.348978 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.554602 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.562609 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.562744 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.562773 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.562832 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.563791 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.894969 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.899968 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.901118 4826 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Mar 19 18:56:17 crc kubenswrapper[4826]: E0319 18:56:17.964779 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC 
m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.992399 4826 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b096ddbf3e0696076f8989fead2305e102369f75e36c651eb4af16e1371d609f" exitCode=0 Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.992486 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b096ddbf3e0696076f8989fead2305e102369f75e36c651eb4af16e1371d609f"} Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.992585 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.993988 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.994031 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.994046 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.994838 4826 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="31a6c22156e87013cbd0d08987f5deccd1d3e19dd0f7e1964de70f5dc43f0a11" exitCode=0 Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.994921 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"31a6c22156e87013cbd0d08987f5deccd1d3e19dd0f7e1964de70f5dc43f0a11"} Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.995046 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.996421 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.996455 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:17 crc kubenswrapper[4826]: I0319 18:56:17.996467 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.000218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cc688da785cd4c7e4c0b716a2fcc36cb3390e163c0cbba9d26364103d6d4cad6"} Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.000266 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ec9c06923049f881a32230044630b90d9f5227af3bb0fbb2b4d4d964dcaa861"} Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.000275 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.000280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"46835172f91653505b229a1af45967a87c797f2b3f8ea096da3a931a8b3b0de0"} Mar 19 
18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.000457 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a22f8de90a48c727556cc628544f3262bb1f7f32592a6672b8895a9e395d28af"} Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.002069 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.002123 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.002141 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.003599 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8" exitCode=0 Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.003693 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8"} Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.003720 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.004703 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.004730 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.004740 4826 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.006204 4826 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0aa69e3e6e43f7e685582a92fd5562b449c6b5e261ebc62fdbd5815620456f47" exitCode=0
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.006247 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0aa69e3e6e43f7e685582a92fd5562b449c6b5e261ebc62fdbd5815620456f47"}
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.006364 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.006636 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.007294 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.007410 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.007431 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.008110 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.008155 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.008171 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:18 crc kubenswrapper[4826]: W0319 18:56:18.364908 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:18 crc kubenswrapper[4826]: E0319 18:56:18.365022 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:18 crc kubenswrapper[4826]: W0319 18:56:18.714068 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:18 crc kubenswrapper[4826]: E0319 18:56:18.714195 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:18 crc kubenswrapper[4826]: I0319 18:56:18.895551 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:18 crc kubenswrapper[4826]: E0319 18:56:18.910042 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.013286 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"662a22f3d3212cd3494b6ee986b4debacae977c6d6a5bb9e39ba2e3ddc5595b9"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.013343 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6743b15ba3bd06ce940f79845ad7bfe07ef92e54c1efedad7621ea693873e530"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.013355 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.013460 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.014543 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.014588 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.014603 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.019727 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.019768 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.019785 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.019797 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.022242 4826 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="de1694a1de0dc4668d456ab18a96f0b2d30693c341f1b7c0fa4acfe8e1953b96" exitCode=0
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.022356 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"de1694a1de0dc4668d456ab18a96f0b2d30693c341f1b7c0fa4acfe8e1953b96"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.022397 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.023405 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.023441 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.023454 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.024356 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.024391 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.024315 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"23561f419aab37a447fc436e6792295ed518a1c09ed3fc54cb4feace6b8c4a8b"}
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.025607 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.025633 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.025646 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.026200 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.026224 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.026239 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.164557 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.166093 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.166161 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.166177 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.166236 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 18:56:19 crc kubenswrapper[4826]: E0319 18:56:19.166901 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc"
Mar 19 18:56:19 crc kubenswrapper[4826]: I0319 18:56:19.257524 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:19 crc kubenswrapper[4826]: W0319 18:56:19.364625 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Mar 19 18:56:19 crc kubenswrapper[4826]: E0319 18:56:19.364753 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.033165 4826 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f09b1d41a79c91ed9ba6b0d9a767a3a70c8e91f52b5a362fb623a79f710439ec" exitCode=0
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.033273 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f09b1d41a79c91ed9ba6b0d9a767a3a70c8e91f52b5a362fb623a79f710439ec"}
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.033511 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.039605 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06d81776e131e89dc900504cef98ff09ceae04dfd5e41b2010247b13236f4751"}
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.039735 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.039854 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.039745 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.040003 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.040165 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041496 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041555 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041575 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041740 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041776 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.041795 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042259 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042303 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042322 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042268 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042402 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042422 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042760 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042810 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.042828 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:20 crc kubenswrapper[4826]: I0319 18:56:20.313503 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.048180 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8a8b51761aab9e32c12d38f3dcaa505e9690beaa862b0ca7690bfdcaedb825c8"}
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.048243 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac1381fd4c384c75d8fb7ce94b681e99af93bfcf86bf9880be672ac5973e3831"}
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.048261 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cb7f5c656ddd72443e2e47a2aa5eca266afb5c2cb18df90e020eba809dfb7006"}
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.048314 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.048369 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.050252 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.050303 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.050319 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:21 crc kubenswrapper[4826]: I0319 18:56:21.948265 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.059102 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"80c7e6e3e6286bc2b0a9add5f62135c51476fa9fe40d87190d35c66ecaa73435"}
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.059208 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3189002cbaf568c587a8cff32a0667af92dfb8508768d93822696ab0537fefc4"}
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.059233 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.059252 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.062246 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.062362 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.062392 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.062953 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.063015 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.063039 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.367718 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.369597 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.369641 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.369702 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.369754 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.421646 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.421944 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.423639 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.423777 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.423809 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.433532 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.489980 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.490264 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.491893 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.491951 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:22 crc kubenswrapper[4826]: I0319 18:56:22.491970 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.061965 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.063575 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.063625 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.063646 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.930979 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.931258 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.933058 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.933116 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:23 crc kubenswrapper[4826]: I0319 18:56:23.933135 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.064260 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.065729 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.065780 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.065797 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.929343 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.929626 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.931144 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.931199 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:24 crc kubenswrapper[4826]: I0319 18:56:24.931209 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.018795 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.067967 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.069337 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.069409 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.069427 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.536962 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.537177 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.539062 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.539130 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.539148 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:25 crc kubenswrapper[4826]: I0319 18:56:25.542332 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:26 crc kubenswrapper[4826]: E0319 18:56:26.058368 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 18:56:26 crc kubenswrapper[4826]: I0319 18:56:26.075192 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:26 crc kubenswrapper[4826]: I0319 18:56:26.076393 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:26 crc kubenswrapper[4826]: I0319 18:56:26.076447 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:26 crc kubenswrapper[4826]: I0319 18:56:26.076460 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:27 crc kubenswrapper[4826]: I0319 18:56:27.929495 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 18:56:27 crc kubenswrapper[4826]: I0319 18:56:27.929637 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.265169 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.265381 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.266719 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.266752 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.266764 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:29 crc kubenswrapper[4826]: I0319 18:56:29.896182 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.259805 4826 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.260220 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.262365 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 19 18:56:30 crc kubenswrapper[4826]: W0319 18:56:30.268281 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.268389 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[4826]: I0319 18:56:30.274171 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 19 18:56:30 crc kubenswrapper[4826]: I0319 18:56:30.274230 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 19 18:56:30 crc kubenswrapper[4826]: W0319 18:56:30.278557 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.278675 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[4826]: I0319 18:56:30.281706 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 19 18:56:30 crc kubenswrapper[4826]: I0319 18:56:30.281757 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 19 18:56:30 crc kubenswrapper[4826]: W0319 18:56:30.282361 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.282481 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[4826]: W0319 18:56:30.282615 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.282722 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 18:56:30 crc kubenswrapper[4826]: E0319 18:56:30.283989 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 18:56:30 crc kubenswrapper[4826]: I0319 18:56:30.899711 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:30Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.095023 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.097771 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06d81776e131e89dc900504cef98ff09ceae04dfd5e41b2010247b13236f4751" exitCode=255
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.097836 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"06d81776e131e89dc900504cef98ff09ceae04dfd5e41b2010247b13236f4751"}
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.098085 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.099309 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.099388 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.099407 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.100698 4826 scope.go:117] "RemoveContainer" containerID="06d81776e131e89dc900504cef98ff09ceae04dfd5e41b2010247b13236f4751"
Mar 19 18:56:31 crc kubenswrapper[4826]: I0319 18:56:31.899547 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:31Z is after 2026-02-23T05:33:13Z
Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.104273 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.105291 4826 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.108612 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" exitCode=255 Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.108698 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd"} Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.108818 4826 scope.go:117] "RemoveContainer" containerID="06d81776e131e89dc900504cef98ff09ceae04dfd5e41b2010247b13236f4751" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.109040 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.110630 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.110716 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.110744 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.111878 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:32 crc kubenswrapper[4826]: E0319 18:56:32.115944 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:32 crc kubenswrapper[4826]: I0319 18:56:32.900229 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:32Z is after 2026-02-23T05:33:13Z Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.114543 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.901193 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:33Z is after 2026-02-23T05:33:13Z Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.941283 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.941731 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.943818 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.943897 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 
18:56:33.943918 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.945197 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:33 crc kubenswrapper[4826]: E0319 18:56:33.945590 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:33 crc kubenswrapper[4826]: I0319 18:56:33.950471 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.119296 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.120323 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.120361 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.120371 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.120901 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:34 crc kubenswrapper[4826]: E0319 18:56:34.121063 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.777975 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:34 crc kubenswrapper[4826]: I0319 18:56:34.899424 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:34Z is after 2026-02-23T05:33:13Z Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.075883 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.076230 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.078119 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.078204 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.078226 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.104902 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.122716 4826 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.122848 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.123831 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.123875 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.123888 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.124467 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:35 crc kubenswrapper[4826]: E0319 18:56:35.124702 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.124984 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.125056 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.125075 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:35 crc kubenswrapper[4826]: W0319 
18:56:35.621493 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:35Z is after 2026-02-23T05:33:13Z Mar 19 18:56:35 crc kubenswrapper[4826]: E0319 18:56:35.621694 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:35 crc kubenswrapper[4826]: I0319 18:56:35.900254 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:35Z is after 2026-02-23T05:33:13Z Mar 19 18:56:36 crc kubenswrapper[4826]: E0319 18:56:36.058572 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.663423 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:36 crc kubenswrapper[4826]: E0319 18:56:36.664557 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:36Z 
is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.665405 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.665450 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.665469 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.665506 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:36 crc kubenswrapper[4826]: E0319 18:56:36.672205 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:36 crc kubenswrapper[4826]: I0319 18:56:36.899192 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:36Z is after 2026-02-23T05:33:13Z Mar 19 18:56:37 crc kubenswrapper[4826]: W0319 18:56:37.048482 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:37Z is after 2026-02-23T05:33:13Z Mar 19 18:56:37 crc kubenswrapper[4826]: E0319 18:56:37.048571 4826 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:37 crc kubenswrapper[4826]: W0319 18:56:37.696277 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:37Z is after 2026-02-23T05:33:13Z Mar 19 18:56:37 crc kubenswrapper[4826]: E0319 18:56:37.696422 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:37 crc kubenswrapper[4826]: I0319 18:56:37.900497 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:37Z is after 2026-02-23T05:33:13Z Mar 19 18:56:37 crc kubenswrapper[4826]: I0319 18:56:37.930213 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:56:37 crc kubenswrapper[4826]: I0319 18:56:37.930295 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.809081 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:38 crc kubenswrapper[4826]: E0319 18:56:38.815290 4826 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.898644 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:38Z is after 2026-02-23T05:33:13Z Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.970893 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.971183 4826 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.972855 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.972923 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.972943 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:38 crc kubenswrapper[4826]: I0319 18:56:38.973945 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:38 crc kubenswrapper[4826]: E0319 18:56:38.974223 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:39 crc kubenswrapper[4826]: W0319 18:56:39.881768 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:39Z is after 2026-02-23T05:33:13Z Mar 19 18:56:39 crc kubenswrapper[4826]: E0319 18:56:39.881896 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T18:56:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:39 crc kubenswrapper[4826]: I0319 18:56:39.899750 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:39Z is after 2026-02-23T05:33:13Z Mar 19 18:56:40 crc kubenswrapper[4826]: E0319 18:56:40.290597 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:40 crc kubenswrapper[4826]: I0319 18:56:40.898006 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:40Z is after 2026-02-23T05:33:13Z Mar 19 18:56:41 crc kubenswrapper[4826]: I0319 18:56:41.898399 4826 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:41Z is after 2026-02-23T05:33:13Z Mar 19 18:56:42 crc kubenswrapper[4826]: I0319 18:56:42.899904 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:42Z is after 2026-02-23T05:33:13Z Mar 19 18:56:43 crc kubenswrapper[4826]: E0319 18:56:43.670426 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.672750 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.674519 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.674576 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.674595 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.674631 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:43 crc kubenswrapper[4826]: E0319 18:56:43.678081 4826 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:43 crc kubenswrapper[4826]: W0319 18:56:43.893119 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:43Z is after 2026-02-23T05:33:13Z Mar 19 18:56:43 crc kubenswrapper[4826]: E0319 18:56:43.893256 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:43 crc kubenswrapper[4826]: I0319 18:56:43.899244 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:43Z is after 2026-02-23T05:33:13Z Mar 19 18:56:44 crc kubenswrapper[4826]: I0319 18:56:44.899922 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:44Z is after 2026-02-23T05:33:13Z 
Mar 19 18:56:45 crc kubenswrapper[4826]: I0319 18:56:45.900728 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:45Z is after 2026-02-23T05:33:13Z Mar 19 18:56:46 crc kubenswrapper[4826]: E0319 18:56:46.058862 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:56:46 crc kubenswrapper[4826]: I0319 18:56:46.900487 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:46Z is after 2026-02-23T05:33:13Z Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.898447 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:47Z is after 2026-02-23T05:33:13Z Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.930770 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.930937 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.931046 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.931355 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.933088 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.933158 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.933179 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.933999 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"46835172f91653505b229a1af45967a87c797f2b3f8ea096da3a931a8b3b0de0"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 18:56:47 crc kubenswrapper[4826]: I0319 18:56:47.934175 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://46835172f91653505b229a1af45967a87c797f2b3f8ea096da3a931a8b3b0de0" gracePeriod=30 Mar 19 
18:56:48 crc kubenswrapper[4826]: I0319 18:56:48.164845 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:56:48 crc kubenswrapper[4826]: I0319 18:56:48.165307 4826 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="46835172f91653505b229a1af45967a87c797f2b3f8ea096da3a931a8b3b0de0" exitCode=255 Mar 19 18:56:48 crc kubenswrapper[4826]: I0319 18:56:48.165349 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"46835172f91653505b229a1af45967a87c797f2b3f8ea096da3a931a8b3b0de0"} Mar 19 18:56:48 crc kubenswrapper[4826]: I0319 18:56:48.900217 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:48Z is after 2026-02-23T05:33:13Z Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.173358 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.174777 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e283d8ec5464e99418607f84ce6ea083f6c94a592433799eddeb8c1d41760bb"} Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.174961 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:49 
crc kubenswrapper[4826]: I0319 18:56:49.176648 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.176736 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.176754 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:49 crc kubenswrapper[4826]: I0319 18:56:49.900370 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:49Z is after 2026-02-23T05:33:13Z Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.178061 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.180006 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.180230 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.180372 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:50 crc kubenswrapper[4826]: E0319 18:56:50.298597 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:56:50 crc kubenswrapper[4826]: E0319 18:56:50.678808 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.678837 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.680799 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.680943 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.681039 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.681148 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:50 crc kubenswrapper[4826]: E0319 18:56:50.686888 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.900517 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:50Z is after 2026-02-23T05:33:13Z Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.976260 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.978173 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.978242 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.978267 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:50 crc kubenswrapper[4826]: I0319 18:56:50.979336 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:51 crc kubenswrapper[4826]: I0319 18:56:51.900094 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:51Z is after 2026-02-23T05:33:13Z Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.187511 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.188252 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.191252 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" exitCode=255 Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.191325 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018"} Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.191382 4826 scope.go:117] "RemoveContainer" containerID="27fc2cb49c00fd7b76d0c7bbd73b16347a4d94291d77168b18fed17aa0e673fd" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.191623 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.193470 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.193512 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.193534 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.194371 4826 scope.go:117] "RemoveContainer" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:56:52 
crc kubenswrapper[4826]: E0319 18:56:52.194681 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.490537 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.490802 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.492819 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.492873 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.492891 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:52 crc kubenswrapper[4826]: I0319 18:56:52.900135 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:52Z is after 2026-02-23T05:33:13Z Mar 19 18:56:53 crc kubenswrapper[4826]: I0319 18:56:53.198467 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:56:53 crc 
kubenswrapper[4826]: I0319 18:56:53.898908 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:53Z is after 2026-02-23T05:33:13Z Mar 19 18:56:54 crc kubenswrapper[4826]: W0319 18:56:54.476208 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z Mar 19 18:56:54 crc kubenswrapper[4826]: E0319 18:56:54.476920 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:54 crc kubenswrapper[4826]: W0319 18:56:54.515134 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z Mar 19 18:56:54 crc kubenswrapper[4826]: E0319 18:56:54.515252 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.778250 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.778516 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.780347 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.780404 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.780428 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.781410 4826 scope.go:117] "RemoveContainer" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:56:54 crc kubenswrapper[4826]: E0319 18:56:54.781741 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.899879 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:54Z is after 2026-02-23T05:33:13Z Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.930419 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.931033 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.933174 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.933250 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:54 crc kubenswrapper[4826]: I0319 18:56:54.933271 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:55 crc kubenswrapper[4826]: I0319 18:56:55.885719 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:56:55 crc kubenswrapper[4826]: E0319 18:56:55.893019 4826 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 18:56:55 crc kubenswrapper[4826]: E0319 18:56:55.895061 4826 certificate_manager.go:440] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 18:56:55 crc kubenswrapper[4826]: I0319 18:56:55.901883 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:55Z is after 2026-02-23T05:33:13Z Mar 19 18:56:56 crc kubenswrapper[4826]: E0319 18:56:56.059025 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:56:56 crc kubenswrapper[4826]: I0319 18:56:56.899900 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:56Z is after 2026-02-23T05:33:13Z Mar 19 18:56:57 crc kubenswrapper[4826]: E0319 18:56:57.682774 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.687068 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.688343 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.688379 4826 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.688388 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.688412 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:56:57 crc kubenswrapper[4826]: E0319 18:56:57.691498 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.899975 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:57Z is after 2026-02-23T05:33:13Z Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.930324 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Mar 19 18:56:57 crc kubenswrapper[4826]: I0319 18:56:57.930406 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.900611 4826 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:58Z is after 2026-02-23T05:33:13Z Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.970320 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.970696 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.972465 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.972519 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.972540 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:56:58 crc kubenswrapper[4826]: I0319 18:56:58.973498 4826 scope.go:117] "RemoveContainer" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:56:58 crc kubenswrapper[4826]: E0319 18:56:58.974188 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:56:59 crc kubenswrapper[4826]: I0319 18:56:59.899515 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:56:59Z is after 2026-02-23T05:33:13Z Mar 19 18:57:00 crc kubenswrapper[4826]: E0319 18:57:00.304619 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:00 crc kubenswrapper[4826]: I0319 18:57:00.898268 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:00Z is after 2026-02-23T05:33:13Z Mar 19 18:57:01 crc kubenswrapper[4826]: I0319 18:57:01.899363 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:01Z is after 2026-02-23T05:33:13Z Mar 19 
18:57:02 crc kubenswrapper[4826]: I0319 18:57:02.899437 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:02Z is after 2026-02-23T05:33:13Z Mar 19 18:57:03 crc kubenswrapper[4826]: I0319 18:57:03.902928 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:04 crc kubenswrapper[4826]: E0319 18:57:04.691626 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.691732 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.694022 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.694284 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.694470 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.694700 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:04 crc kubenswrapper[4826]: E0319 18:57:04.703363 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is 
forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:04 crc kubenswrapper[4826]: I0319 18:57:04.907168 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:04 crc kubenswrapper[4826]: W0319 18:57:04.943827 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 18:57:04 crc kubenswrapper[4826]: E0319 18:57:04.943951 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 18:57:05 crc kubenswrapper[4826]: I0319 18:57:05.902707 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:06 crc kubenswrapper[4826]: E0319 18:57:06.059644 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:06 crc kubenswrapper[4826]: I0319 18:57:06.900025 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:07 crc kubenswrapper[4826]: I0319 18:57:07.901524 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:07 crc kubenswrapper[4826]: I0319 18:57:07.930064 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:57:07 crc kubenswrapper[4826]: I0319 18:57:07.930161 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:57:08 crc kubenswrapper[4826]: W0319 18:57:08.027205 4826 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 18:57:08 crc kubenswrapper[4826]: E0319 18:57:08.027321 4826 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 18:57:08 crc kubenswrapper[4826]: I0319 18:57:08.902910 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 19 18:57:09 crc kubenswrapper[4826]: I0319 18:57:09.903400 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.316118 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303129ab1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,LastTimestamp:2026-03-19 18:56:15.892689332 +0000 UTC m=+0.646757676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.322497 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.330558 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.338284 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.350251 4826 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031be9ff9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.048881564 +0000 UTC m=+0.802949917,LastTimestamp:2026-03-19 18:56:16.048881564 +0000 UTC m=+0.802949917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.356366 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.077794345 +0000 UTC m=+0.831862728,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.363594 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.077949137 +0000 UTC m=+0.832017490,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.371222 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.078066748 +0000 UTC m=+0.832135101,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.378990 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.0801045 +0000 UTC m=+0.834172823,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.386326 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.08012727 +0000 UTC m=+0.834195593,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.393303 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status 
is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.0801382 +0000 UTC m=+0.834206523,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.400214 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.080498545 +0000 UTC m=+0.834566898,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.407709 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC 
m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.080645237 +0000 UTC m=+0.834713590,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.415290 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.080826628 +0000 UTC m=+0.834894981,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.422193 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.082521297 +0000 UTC m=+0.836589620,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.427409 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.082548297 +0000 UTC m=+0.836616620,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.433092 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.082558657 +0000 UTC m=+0.836626980,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.438997 4826 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.084484728 +0000 UTC m=+0.838553051,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.446340 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.08450704 +0000 UTC m=+0.838575363,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.453356 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.08453604 +0000 UTC m=+0.838604403,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.459443 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.08456213 +0000 UTC m=+0.838630483,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.464982 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.08458024 +0000 UTC m=+0.838648593,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.472164 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031632b7d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031632b7d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952984022 +0000 UTC m=+0.707052375,LastTimestamp:2026-03-19 18:56:16.084599871 +0000 UTC m=+0.838668194,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.479559 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e53031631dc89\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e53031631dc89 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is 
now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952927881 +0000 UTC m=+0.706996224,LastTimestamp:2026-03-19 18:56:16.084785372 +0000 UTC m=+0.838853695,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.486420 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e5303163277cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e5303163277cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:15.952967631 +0000 UTC m=+0.707035984,LastTimestamp:2026-03-19 18:56:16.084801303 +0000 UTC m=+0.838869626,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.494191 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e5303358486a4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.478439076 +0000 UTC m=+1.232507429,LastTimestamp:2026-03-19 18:56:16.478439076 +0000 UTC m=+1.232507429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.502368 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530335d790ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.483881197 +0000 UTC m=+1.237949520,LastTimestamp:2026-03-19 18:56:16.483881197 +0000 UTC m=+1.237949520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.509896 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5303363e9f22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.490635042 +0000 UTC m=+1.244703395,LastTimestamp:2026-03-19 18:56:16.490635042 +0000 UTC m=+1.244703395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.517819 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e530336925dff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.496123391 +0000 UTC m=+1.250191714,LastTimestamp:2026-03-19 18:56:16.496123391 +0000 UTC m=+1.250191714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.525039 4826 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e530336aa4a9a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:16.49769129 +0000 UTC m=+1.251759613,LastTimestamp:2026-03-19 18:56:16.49769129 +0000 UTC m=+1.251759613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.532776 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53035a6fbe12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.097834002 +0000 UTC m=+1.851902335,LastTimestamp:2026-03-19 18:56:17.097834002 +0000 UTC m=+1.851902335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.540760 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53035a883ea3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.099439779 +0000 UTC m=+1.853508112,LastTimestamp:2026-03-19 18:56:17.099439779 +0000 UTC m=+1.853508112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.549537 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53035a89774c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.09951982 +0000 UTC m=+1.853588143,LastTimestamp:2026-03-19 18:56:17.09951982 +0000 UTC 
m=+1.853588143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.560464 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53035a93038c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.100145548 +0000 UTC m=+1.854213871,LastTimestamp:2026-03-19 18:56:17.100145548 +0000 UTC m=+1.854213871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.567408 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e53035a982c1b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
18:56:17.100483611 +0000 UTC m=+1.854551924,LastTimestamp:2026-03-19 18:56:17.100483611 +0000 UTC m=+1.854551924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.574180 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53035b3c0ce1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.111223521 +0000 UTC m=+1.865291854,LastTimestamp:2026-03-19 18:56:17.111223521 +0000 UTC m=+1.865291854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.579721 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53035b5628d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.112934609 +0000 UTC m=+1.867002932,LastTimestamp:2026-03-19 18:56:17.112934609 +0000 UTC m=+1.867002932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.586409 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53035b625ccd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.113734349 +0000 UTC m=+1.867802682,LastTimestamp:2026-03-19 18:56:17.113734349 +0000 UTC m=+1.867802682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.591064 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53035b73796e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.11485579 +0000 UTC m=+1.868924103,LastTimestamp:2026-03-19 18:56:17.11485579 +0000 UTC m=+1.868924103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.598371 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e53035bd91764 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.121515364 +0000 UTC m=+1.875583687,LastTimestamp:2026-03-19 18:56:17.121515364 +0000 UTC m=+1.875583687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.603746 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53035bdaca34 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.121626676 +0000 UTC m=+1.875695009,LastTimestamp:2026-03-19 18:56:17.121626676 +0000 UTC m=+1.875695009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.611030 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53036ea30adf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.436740319 +0000 UTC m=+2.190808672,LastTimestamp:2026-03-19 18:56:17.436740319 +0000 UTC m=+2.190808672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.617980 4826 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53036f7626ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.450575562 +0000 UTC m=+2.204643875,LastTimestamp:2026-03-19 18:56:17.450575562 +0000 UTC m=+2.204643875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.624815 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53036f8c5144 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.452028228 +0000 UTC 
m=+2.206096571,LastTimestamp:2026-03-19 18:56:17.452028228 +0000 UTC m=+2.206096571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.632513 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53037f78a323 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.719173923 +0000 UTC m=+2.473242246,LastTimestamp:2026-03-19 18:56:17.719173923 +0000 UTC m=+2.473242246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.638985 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530380a017bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.738536891 +0000 UTC m=+2.492605214,LastTimestamp:2026-03-19 18:56:17.738536891 +0000 UTC m=+2.492605214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.647434 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530380b193f4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.739682804 +0000 UTC m=+2.493751117,LastTimestamp:2026-03-19 18:56:17.739682804 +0000 UTC m=+2.493751117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.655437 4826 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53038eae078e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.974331278 +0000 UTC m=+2.728399601,LastTimestamp:2026-03-19 18:56:17.974331278 +0000 UTC m=+2.728399601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.662403 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53038f70f181 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.987105153 +0000 UTC m=+2.741173476,LastTimestamp:2026-03-19 18:56:17.987105153 +0000 UTC 
m=+2.741173476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.670280 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e5303900322dd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.996686045 +0000 UTC m=+2.750754398,LastTimestamp:2026-03-19 18:56:17.996686045 +0000 UTC m=+2.750754398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.678759 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303901f5f87 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.998536583 +0000 UTC m=+2.752604936,LastTimestamp:2026-03-19 18:56:17.998536583 +0000 UTC m=+2.752604936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.690379 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303909669ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.006337998 +0000 UTC m=+2.760406331,LastTimestamp:2026-03-19 18:56:18.006337998 +0000 UTC m=+2.760406331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.698574 4826 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530391a41bd0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.024012752 +0000 UTC m=+2.778081095,LastTimestamp:2026-03-19 18:56:18.024012752 +0000 UTC m=+2.778081095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.705741 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53039dc0b3c0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.227213248 +0000 UTC m=+2.981281571,LastTimestamp:2026-03-19 18:56:18.227213248 +0000 UTC m=+2.981281571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.712702 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e53039df9b76f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.230949743 +0000 UTC m=+2.985018056,LastTimestamp:2026-03-19 18:56:18.230949743 +0000 UTC m=+2.985018056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.718788 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53039e359002 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.23487181 +0000 UTC m=+2.988940133,LastTimestamp:2026-03-19 18:56:18.23487181 +0000 UTC 
m=+2.988940133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.724094 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53039e5e55ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.237543854 +0000 UTC m=+2.991612167,LastTimestamp:2026-03-19 18:56:18.237543854 +0000 UTC m=+2.991612167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.731168 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e53039e72c22f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.238882351 +0000 UTC m=+2.992950674,LastTimestamp:2026-03-19 18:56:18.238882351 +0000 UTC m=+2.992950674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.736462 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53039ede9c4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.245950542 +0000 UTC m=+3.000018855,LastTimestamp:2026-03-19 18:56:18.245950542 +0000 UTC m=+3.000018855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.743699 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e53039f1ac15f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.249892191 +0000 UTC m=+3.003960504,LastTimestamp:2026-03-19 18:56:18.249892191 +0000 UTC m=+3.003960504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.750530 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303a01b6408 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.266711048 +0000 UTC m=+3.020779361,LastTimestamp:2026-03-19 18:56:18.266711048 +0000 UTC m=+3.020779361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.758284 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303a0319943 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.268166467 +0000 UTC m=+3.022234790,LastTimestamp:2026-03-19 18:56:18.268166467 +0000 UTC m=+3.022234790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.766116 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303ab536aa6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.454932134 +0000 UTC m=+3.209000447,LastTimestamp:2026-03-19 18:56:18.454932134 +0000 UTC m=+3.209000447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.773342 4826 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303ab65408f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.456101007 +0000 UTC m=+3.210169330,LastTimestamp:2026-03-19 18:56:18.456101007 +0000 UTC m=+3.210169330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.781304 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303acde4eab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.480811691 +0000 UTC m=+3.234880024,LastTimestamp:2026-03-19 18:56:18.480811691 +0000 UTC m=+3.234880024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc 
kubenswrapper[4826]: E0319 18:57:10.787797 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303acef622a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.481930794 +0000 UTC m=+3.235999107,LastTimestamp:2026-03-19 18:56:18.481930794 +0000 UTC m=+3.235999107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.792744 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303ad1e5103 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.485006595 +0000 UTC 
m=+3.239074918,LastTimestamp:2026-03-19 18:56:18.485006595 +0000 UTC m=+3.239074918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.797513 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303ad5e5d0b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.489203979 +0000 UTC m=+3.243272292,LastTimestamp:2026-03-19 18:56:18.489203979 +0000 UTC m=+3.243272292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.803259 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303bb3033d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.721059798 +0000 UTC m=+3.475128151,LastTimestamp:2026-03-19 18:56:18.721059798 +0000 UTC m=+3.475128151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.808141 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303bb9799d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.727836113 +0000 UTC m=+3.481904466,LastTimestamp:2026-03-19 18:56:18.727836113 +0000 UTC m=+3.481904466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.814840 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303bc734336 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.742231862 +0000 UTC m=+3.496300185,LastTimestamp:2026-03-19 18:56:18.742231862 +0000 UTC m=+3.496300185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.819731 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303bc87a4b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.743567539 +0000 UTC m=+3.497635892,LastTimestamp:2026-03-19 18:56:18.743567539 +0000 UTC m=+3.497635892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.824931 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e5303bcd10bb3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.748378035 +0000 UTC m=+3.502446358,LastTimestamp:2026-03-19 18:56:18.748378035 +0000 UTC m=+3.502446358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.831601 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5303be39563d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.771990077 +0000 UTC m=+3.526058390,LastTimestamp:2026-03-19 18:56:18.771990077 +0000 UTC 
m=+3.526058390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.837545 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303c8655698 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.942645912 +0000 UTC m=+3.696714225,LastTimestamp:2026-03-19 18:56:18.942645912 +0000 UTC m=+3.696714225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.844049 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303c92cb8f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
18:56:18.955712753 +0000 UTC m=+3.709781066,LastTimestamp:2026-03-19 18:56:18.955712753 +0000 UTC m=+3.709781066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.851077 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303c93ef8e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.956908777 +0000 UTC m=+3.710977090,LastTimestamp:2026-03-19 18:56:18.956908777 +0000 UTC m=+3.710977090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.859150 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5303cd4cbc74 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.024919668 +0000 UTC m=+3.778987991,LastTimestamp:2026-03-19 18:56:19.024919668 +0000 UTC m=+3.778987991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.867203 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303d7ca3e24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.200917028 +0000 UTC m=+3.954985351,LastTimestamp:2026-03-19 18:56:19.200917028 +0000 UTC m=+3.954985351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.873909 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5303d8a69626 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.215357478 +0000 UTC m=+3.969425811,LastTimestamp:2026-03-19 18:56:19.215357478 +0000 UTC m=+3.969425811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.881444 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303d8bc5857 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.216783447 +0000 UTC m=+3.970851770,LastTimestamp:2026-03-19 18:56:19.216783447 +0000 UTC m=+3.970851770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.889172 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5303da0fb334 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.239023412 +0000 UTC m=+3.993091765,LastTimestamp:2026-03-19 18:56:19.239023412 +0000 UTC m=+3.993091765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.901511 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.901613 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53040a0e9938 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.044257592 +0000 UTC m=+4.798325935,LastTimestamp:2026-03-19 
18:56:20.044257592 +0000 UTC m=+4.798325935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.908396 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530419a3bebe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.305690302 +0000 UTC m=+5.059758655,LastTimestamp:2026-03-19 18:56:20.305690302 +0000 UTC m=+5.059758655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.914941 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53041a850e7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.320456318 +0000 UTC m=+5.074524671,LastTimestamp:2026-03-19 18:56:20.320456318 +0000 UTC m=+5.074524671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.921935 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53041aa0181d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.322228253 +0000 UTC m=+5.076296596,LastTimestamp:2026-03-19 18:56:20.322228253 +0000 UTC m=+5.076296596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.928745 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53042adf3d43 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.594801987 +0000 UTC m=+5.348870340,LastTimestamp:2026-03-19 18:56:20.594801987 +0000 UTC 
m=+5.348870340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.935748 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53042bcb9ec7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.610293447 +0000 UTC m=+5.364361760,LastTimestamp:2026-03-19 18:56:20.610293447 +0000 UTC m=+5.364361760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.943160 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53042be13361 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.611707745 +0000 UTC 
m=+5.365776058,LastTimestamp:2026-03-19 18:56:20.611707745 +0000 UTC m=+5.365776058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.948813 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e5304395c5caa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.83788305 +0000 UTC m=+5.591951403,LastTimestamp:2026-03-19 18:56:20.83788305 +0000 UTC m=+5.591951403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.955920 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53043a4e32c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.853732037 +0000 UTC m=+5.607800390,LastTimestamp:2026-03-19 18:56:20.853732037 +0000 UTC 
m=+5.607800390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.963364 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53043a72cb15 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:20.856130325 +0000 UTC m=+5.610198678,LastTimestamp:2026-03-19 18:56:20.856130325 +0000 UTC m=+5.610198678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.971050 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530448a47d82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:21.09426829 +0000 UTC 
m=+5.848336613,LastTimestamp:2026-03-19 18:56:21.09426829 +0000 UTC m=+5.848336613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.976051 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.978961 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530449f73d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:21.116468534 +0000 UTC m=+5.870536857,LastTimestamp:2026-03-19 18:56:21.116468534 +0000 UTC m=+5.870536857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.980424 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.980522 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.980552 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:10 crc kubenswrapper[4826]: I0319 18:57:10.984605 4826 scope.go:117] "RemoveContainer" 
containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.985008 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.987473 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53044a0bad6c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:21.11780798 +0000 UTC m=+5.871876303,LastTimestamp:2026-03-19 18:56:21.11780798 +0000 UTC m=+5.871876303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:10 crc kubenswrapper[4826]: E0319 18:57:10.995303 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e530459452093 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:21.373231251 +0000 UTC m=+6.127299594,LastTimestamp:2026-03-19 18:56:21.373231251 +0000 UTC m=+6.127299594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.002387 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e53045a459e98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:21.390040728 +0000 UTC m=+6.144109071,LastTimestamp:2026-03-19 18:56:21.390040728 +0000 UTC m=+6.144109071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.012479 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-controller-manager-crc.189e5305e00f0e64 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:11 crc kubenswrapper[4826]: body: Mar 19 18:57:11 crc kubenswrapper[4826]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:27.929579108 +0000 UTC m=+12.683647451,LastTimestamp:2026-03-19 18:56:27.929579108 +0000 UTC m=+12.683647451,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.020466 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e5305e011f1ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:27.929768394 +0000 UTC m=+12.683836747,LastTimestamp:2026-03-19 18:56:27.929768394 +0000 UTC m=+12.683836747,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.026985 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-apiserver-crc.189e53066bcf5803 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 18:57:11 crc kubenswrapper[4826]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 18:57:11 crc kubenswrapper[4826]: Mar 19 18:57:11 crc kubenswrapper[4826]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.274213891 +0000 UTC m=+15.028282214,LastTimestamp:2026-03-19 18:56:30.274213891 +0000 UTC m=+15.028282214,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.033601 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53066bcff9da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.274255322 +0000 UTC m=+15.028323645,LastTimestamp:2026-03-19 18:56:30.274255322 +0000 UTC m=+15.028323645,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.041568 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-apiserver-crc.189e53066c4247c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 18:57:11 crc kubenswrapper[4826]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 19 18:57:11 crc kubenswrapper[4826]: Mar 19 18:57:11 crc kubenswrapper[4826]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.281746376 +0000 UTC 
m=+15.035814699,LastTimestamp:2026-03-19 18:56:30.281746376 +0000 UTC m=+15.035814699,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.048076 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e53066bcff9da\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e53066bcff9da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:30.274255322 +0000 UTC m=+15.028323645,LastTimestamp:2026-03-19 18:56:30.281779007 +0000 UTC m=+15.035847330,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.055935 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e5303c93ef8e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303c93ef8e9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:18.956908777 +0000 UTC m=+3.710977090,LastTimestamp:2026-03-19 18:56:31.102265825 +0000 UTC m=+15.856334168,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.061396 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e5303d7ca3e24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303d7ca3e24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.200917028 +0000 UTC m=+3.954985351,LastTimestamp:2026-03-19 18:56:31.318434814 +0000 UTC m=+16.072503137,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.069345 4826 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189e5303d8bc5857\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e5303d8bc5857 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:19.216783447 +0000 UTC m=+3.970851770,LastTimestamp:2026-03-19 18:56:31.328388595 +0000 UTC m=+16.082456908,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.077645 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-controller-manager-crc.189e5308342585e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:11 crc kubenswrapper[4826]: body: Mar 19 18:57:11 crc kubenswrapper[4826]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:37.930272225 +0000 UTC m=+22.684340578,LastTimestamp:2026-03-19 18:56:37.930272225 +0000 UTC m=+22.684340578,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.081944 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53083426b61a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:37.930350106 +0000 UTC m=+22.684418459,LastTimestamp:2026-03-19 18:56:37.930350106 +0000 UTC m=+22.684418459,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.090503 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e5308342585e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189e5308342585e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:11 crc kubenswrapper[4826]: body: Mar 19 18:57:11 crc kubenswrapper[4826]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:37.930272225 +0000 UTC m=+22.684340578,LastTimestamp:2026-03-19 18:56:47.930895316 +0000 UTC m=+32.684963669,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.098720 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e53083426b61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53083426b61a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:37.930350106 +0000 UTC m=+22.684418459,LastTimestamp:2026-03-19 18:56:47.930980808 +0000 UTC m=+32.685049151,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.105904 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530a886cabdd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:47.934155741 +0000 UTC m=+32.688224054,LastTimestamp:2026-03-19 18:56:47.934155741 +0000 UTC m=+32.688224054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.111240 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e53035b5628d1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53035b5628d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.112934609 +0000 UTC m=+1.867002932,LastTimestamp:2026-03-19 18:56:48.060390832 +0000 UTC m=+32.814459185,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.117093 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e53036ea30adf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53036ea30adf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.436740319 +0000 UTC m=+2.190808672,LastTimestamp:2026-03-19 18:56:48.273165991 +0000 UTC m=+33.027234314,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.122294 4826 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189e53036f7626ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e53036f7626ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:17.450575562 +0000 UTC m=+2.204643875,LastTimestamp:2026-03-19 18:56:48.285207931 +0000 UTC m=+33.039276254,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.131428 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-controller-manager-crc.189e530cdc3ef1ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded Mar 19 18:57:11 crc kubenswrapper[4826]: body: Mar 19 18:57:11 crc kubenswrapper[4826]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:57.930379759 +0000 UTC m=+42.684448102,LastTimestamp:2026-03-19 18:56:57.930379759 +0000 UTC m=+42.684448102,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.139827 4826 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e530cdc4000ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:57.930449132 +0000 UTC m=+42.684517485,LastTimestamp:2026-03-19 18:56:57.930449132 +0000 UTC m=+42.684517485,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.148549 4826 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e5308342585e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 18:57:11 crc kubenswrapper[4826]: &Event{ObjectMeta:{kube-controller-manager-crc.189e5308342585e1 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 18:57:11 crc kubenswrapper[4826]: body: Mar 19 18:57:11 crc kubenswrapper[4826]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:56:37.930272225 +0000 UTC m=+22.684340578,LastTimestamp:2026-03-19 18:57:07.930125962 +0000 UTC m=+52.684194305,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 18:57:11 crc kubenswrapper[4826]: > Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.699604 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:11 crc kubenswrapper[4826]: I0319 18:57:11.703928 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:11 crc kubenswrapper[4826]: I0319 18:57:11.705317 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:11 crc kubenswrapper[4826]: I0319 18:57:11.705354 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:11 crc kubenswrapper[4826]: I0319 18:57:11.705366 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:11 crc 
kubenswrapper[4826]: I0319 18:57:11.705394 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:11 crc kubenswrapper[4826]: E0319 18:57:11.711136 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:11 crc kubenswrapper[4826]: I0319 18:57:11.899470 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.428403 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.428645 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.430197 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.430258 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.430276 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:12 crc kubenswrapper[4826]: I0319 18:57:12.902356 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:13 crc kubenswrapper[4826]: I0319 18:57:13.904288 4826 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.900302 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.937227 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.937486 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.939183 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.939235 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.939255 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:14 crc kubenswrapper[4826]: I0319 18:57:14.944237 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:57:15 crc kubenswrapper[4826]: I0319 18:57:15.277240 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:15 crc kubenswrapper[4826]: I0319 18:57:15.278608 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:15 crc kubenswrapper[4826]: I0319 18:57:15.278870 4826 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:15 crc kubenswrapper[4826]: I0319 18:57:15.279029 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:15 crc kubenswrapper[4826]: I0319 18:57:15.901683 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:16 crc kubenswrapper[4826]: E0319 18:57:16.059808 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:16 crc kubenswrapper[4826]: I0319 18:57:16.899364 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:17 crc kubenswrapper[4826]: I0319 18:57:17.905415 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:18 crc kubenswrapper[4826]: E0319 18:57:18.709033 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.712111 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.714118 4826 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.714209 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.714232 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.714282 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:18 crc kubenswrapper[4826]: E0319 18:57:18.719404 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:18 crc kubenswrapper[4826]: I0319 18:57:18.899843 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:19 crc kubenswrapper[4826]: I0319 18:57:19.901458 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:20 crc kubenswrapper[4826]: I0319 18:57:20.900645 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.901624 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.975971 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.977762 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.977832 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.977859 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:21 crc kubenswrapper[4826]: I0319 18:57:21.978868 4826 scope.go:117] "RemoveContainer" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.303570 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.307115 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7"} Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.307367 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.309693 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.309758 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 
18:57:22.309776 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:22 crc kubenswrapper[4826]: I0319 18:57:22.902547 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.322030 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.323018 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.325883 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" exitCode=255 Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.325955 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7"} Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.326041 4826 scope.go:117] "RemoveContainer" containerID="efc0f32dc86eaccdc86d2f3b54c7d43a36ccf056683fa726cfe190e85c81e018" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.326205 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.327401 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.327465 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.327528 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.331063 4826 scope.go:117] "RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:23 crc kubenswrapper[4826]: E0319 18:57:23.331373 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:23 crc kubenswrapper[4826]: I0319 18:57:23.903212 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.332826 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.778375 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.778714 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.780345 4826 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.780441 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.780462 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.781744 4826 scope.go:117] "RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:24 crc kubenswrapper[4826]: E0319 18:57:24.782163 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:24 crc kubenswrapper[4826]: I0319 18:57:24.902453 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:25 crc kubenswrapper[4826]: E0319 18:57:25.717483 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.720558 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.722616 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.722701 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.722726 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.722774 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:25 crc kubenswrapper[4826]: E0319 18:57:25.729841 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:25 crc kubenswrapper[4826]: I0319 18:57:25.902834 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:26 crc kubenswrapper[4826]: E0319 18:57:26.060335 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:26 crc kubenswrapper[4826]: I0319 18:57:26.901837 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:27 crc kubenswrapper[4826]: I0319 18:57:27.897806 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 18:57:27 crc kubenswrapper[4826]: I0319 18:57:27.899529 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:27 crc kubenswrapper[4826]: I0319 18:57:27.919182 4826 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.902508 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.971268 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.971587 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.973152 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.973209 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.973228 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:28 crc kubenswrapper[4826]: I0319 18:57:28.974139 4826 scope.go:117] "RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:28 crc kubenswrapper[4826]: E0319 18:57:28.974415 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:29 crc kubenswrapper[4826]: I0319 18:57:29.905775 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:29 crc kubenswrapper[4826]: I0319 18:57:29.975386 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:29 crc kubenswrapper[4826]: I0319 18:57:29.976906 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:29 crc kubenswrapper[4826]: I0319 18:57:29.976977 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:29 crc kubenswrapper[4826]: I0319 18:57:29.976995 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:30 crc kubenswrapper[4826]: I0319 18:57:30.901296 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:31 crc kubenswrapper[4826]: I0319 18:57:31.902567 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:32 crc kubenswrapper[4826]: E0319 18:57:32.723982 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.730159 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.731263 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.731303 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.731315 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.731340 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:32 crc kubenswrapper[4826]: E0319 18:57:32.735502 4826 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 18:57:32 crc kubenswrapper[4826]: I0319 18:57:32.906079 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:33 crc kubenswrapper[4826]: I0319 18:57:33.901207 4826 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 18:57:34 crc kubenswrapper[4826]: I0319 18:57:34.679918 4826 csr.go:261] certificate signing request csr-26rhg is approved, waiting to be issued Mar 19 18:57:34 crc kubenswrapper[4826]: I0319 18:57:34.693512 4826 csr.go:257] certificate signing request csr-26rhg is issued Mar 19 
18:57:34 crc kubenswrapper[4826]: I0319 18:57:34.731453 4826 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 18:57:34 crc kubenswrapper[4826]: I0319 18:57:34.762190 4826 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 18:57:35 crc kubenswrapper[4826]: I0319 18:57:35.696042 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 08:30:37.100180743 +0000 UTC Mar 19 18:57:35 crc kubenswrapper[4826]: I0319 18:57:35.696113 4826 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6709h33m1.404073111s for next certificate rotation Mar 19 18:57:36 crc kubenswrapper[4826]: E0319 18:57:36.061538 4826 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 18:57:37 crc kubenswrapper[4826]: I0319 18:57:37.352555 4826 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.736710 4826 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.738289 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.738352 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.738362 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.738519 4826 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 
18:57:39.748990 4826 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.749394 4826 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.749433 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.755287 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.755368 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.755388 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.755959 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.756311 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:39Z","lastTransitionTime":"2026-03-19T18:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.771162 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"188a80e8-02e6-4be0-9b87-2d80617c6be2\\\",\\\"systemUUID\\\":\\\"e8d792fa-f63b-43a0-bb9a-92bfd45c6f50\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.775945 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.775985 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.775994 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.776013 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.776028 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:39Z","lastTransitionTime":"2026-03-19T18:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.792254 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"188a80e8-02e6-4be0-9b87-2d80617c6be2\\\",\\\"systemUUID\\\":\\\"e8d792fa-f63b-43a0-bb9a-92bfd45c6f50\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.798094 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.798129 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.798142 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.798163 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.798178 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:39Z","lastTransitionTime":"2026-03-19T18:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.811944 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"188a80e8-02e6-4be0-9b87-2d80617c6be2\\\",\\\"systemUUID\\\":\\\"e8d792fa-f63b-43a0-bb9a-92bfd45c6f50\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.818490 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.818553 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.818572 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.818598 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:39 crc kubenswrapper[4826]: I0319 18:57:39.818620 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:39Z","lastTransitionTime":"2026-03-19T18:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.836970 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"188a80e8-02e6-4be0-9b87-2d80617c6be2\\\",\\\"systemUUID\\\":\\\"e8d792fa-f63b-43a0-bb9a-92bfd45c6f50\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.837182 4826 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.837224 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:39 crc kubenswrapper[4826]: E0319 18:57:39.937553 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: I0319 18:57:40.009266 4826 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.037958 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.138642 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.239330 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.339891 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.440019 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.541066 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.641843 4826 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.742763 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.843149 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:40 crc kubenswrapper[4826]: E0319 18:57:40.943611 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.044168 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.144978 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.245399 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.346035 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.447127 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.547235 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.648146 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.748731 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc 
kubenswrapper[4826]: E0319 18:57:41.848858 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:41 crc kubenswrapper[4826]: E0319 18:57:41.949760 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.050263 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.150680 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.251770 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.351989 4826 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.447684 4826 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.456335 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.456431 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.456450 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.456477 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.456497 4826 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.560298 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.560378 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.560403 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.560435 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.560457 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.664483 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.664554 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.664572 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.664598 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.664617 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.768562 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.768614 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.768628 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.768649 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.768684 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.872487 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.872564 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.872583 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.872611 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.872630 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.934056 4826 apiserver.go:52] "Watching apiserver" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.941886 4826 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.942359 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943066 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943070 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.943186 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943334 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943392 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.943708 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943787 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.943967 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:42 crc kubenswrapper[4826]: E0319 18:57:42.944647 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.946113 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.947102 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.947194 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.947735 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.947780 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.948080 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.948846 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.950042 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.950319 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.976013 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:42 crc 
kubenswrapper[4826]: I0319 18:57:42.976075 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.976093 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.976121 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.976140 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:42Z","lastTransitionTime":"2026-03-19T18:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:42 crc kubenswrapper[4826]: I0319 18:57:42.987785 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.002456 4826 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.004138 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016447 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016489 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016513 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016531 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016552 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016573 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016597 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016618 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016643 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016682 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016705 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016760 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016786 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.016814 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 
18:57:43.017319 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017415 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017454 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017484 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017509 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017534 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") 
pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017558 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017594 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017625 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017693 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017720 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017728 4826 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017748 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017777 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017803 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017828 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017854 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017910 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017930 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017962 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.017986 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018012 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 
18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018040 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018062 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018083 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018105 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018130 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018129 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018151 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018211 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018256 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018289 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018312 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: 
"6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018315 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018334 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018382 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018481 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018492 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod 
"a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018510 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018543 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018569 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018595 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018627 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.018674 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018703 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018734 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018738 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018770 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.018763 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019048 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019180 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019212 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019237 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.019260 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019281 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019302 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019326 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019347 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019365 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019382 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019405 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019428 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019451 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019477 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019528 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019554 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019582 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019604 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019629 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019682 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019771 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019800 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019823 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019848 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019875 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019902 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019934 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019966 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019993 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020019 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020043 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.020069 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020095 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020121 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020144 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020173 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020198 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020225 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020251 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020278 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020303 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020326 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 
18:57:43.020350 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020374 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020394 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020412 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020434 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020458 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020485 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020512 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020579 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020604 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020630 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.020670 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020696 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020725 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020751 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020776 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020801 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020828 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020852 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020876 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020903 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020928 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.020952 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020979 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021005 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021031 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021060 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021085 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021109 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021134 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021163 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021186 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021210 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.021236 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021262 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021287 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021373 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021400 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021423 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021462 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021487 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021514 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021539 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021567 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 
18:57:43.021591 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021619 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021673 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021702 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021728 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021785 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021811 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021832 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021856 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021919 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021944 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021971 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021993 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022013 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022033 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022053 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022074 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022094 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022112 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022130 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022149 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022169 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022189 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022207 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022225 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022817 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022840 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022860 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022877 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022900 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022925 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022942 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022960 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022979 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023762 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023832 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023864 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023902 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023939 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 
18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023974 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024012 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024049 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024078 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024109 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024141 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024177 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024204 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024239 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024273 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024304 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 18:57:43 crc kubenswrapper[4826]: 
I0319 18:57:43.024338 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019266 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.028299 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019922 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.019987 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020042 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.020078 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021224 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021309 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021359 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021534 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.021851 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022697 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.022768 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024220 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024223 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024421 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.023035 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024778 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.028552 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024895 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.024880 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.025194 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.028619 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.025365 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.025808 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.025978 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.026074 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.026448 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.026492 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.026535 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027024 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027250 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027274 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027316 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027635 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027717 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.027954 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:43.527911737 +0000 UTC m=+88.281980060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.027925 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.028963 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029074 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029114 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029363 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029394 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029452 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029535 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029532 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029770 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.029904 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.028287 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030136 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030364 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030487 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030624 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030735 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030765 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030788 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.030831 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.031299 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.031384 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.031469 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.031702 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.031814 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.032362 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.032611 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.032691 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.032712 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.033079 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.033112 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.033971 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.034279 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.034328 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.034817 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.034521 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.035125 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.035923 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.036427 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.036756 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037930 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037081 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.036748 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037335 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037619 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037729 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037098 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.037127 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.038369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.038472 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.038558 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.038632 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039142 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039251 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039365 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039579 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039700 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039776 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039853 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039922 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039946 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040008 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040046 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040106 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040203 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040266 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040443 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040464 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040477 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040600 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039311 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.040480 4826 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041301 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041348 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041387 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041438 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041548 4826 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.039961 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.041146 4826 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.041972 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.042645 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.042653 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.042724 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.042800 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.042826 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.042867 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.043121 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.043431 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.043510 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:43.543315216 +0000 UTC m=+88.297383539 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.043754 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.043906 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:43.543888961 +0000 UTC m=+88.297957284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.043989 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044010 4826 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044034 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044050 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044068 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044088 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044112 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044131 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044150 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044163 4826 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044180 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044194 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044209 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.044225 4826 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044238 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044252 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044265 4826 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044283 4826 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044296 4826 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044312 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044331 4826 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044355 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044374 4826 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044391 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044407 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044436 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044452 4826 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044470 4826 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044492 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044511 4826 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044531 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044555 4826 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044577 4826 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044594 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044611 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044624 4826 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044642 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044693 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044713 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044735 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044753 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044770 4826 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 
18:57:43.044786 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044810 4826 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044828 4826 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044846 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044862 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044886 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044907 4826 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044927 4826 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044950 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044969 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.044987 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045008 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045032 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045052 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045071 4826 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node 
\"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045089 4826 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045114 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045134 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045172 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045192 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045223 4826 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045246 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.045264 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045289 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045308 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045326 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045344 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045370 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045389 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045407 4826 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045426 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045448 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045465 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045484 4826 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045512 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045535 4826 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045554 4826 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045571 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045596 4826 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045614 4826 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045633 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045678 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045701 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.045718 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: 
I0319 18:57:43.051700 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.052294 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.059459 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.059429 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.059761 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.060434 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.060856 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.060695 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.060812 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.062041 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.062594 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.062831 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.063328 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.063481 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.063676 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.063523 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.064873 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.064902 4826 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.065017 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:43.564987545 +0000 UTC m=+88.319055898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.064249 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.063629 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.063730 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.066693 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.068922 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.068945 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.072367 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.072415 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.072438 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.072518 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:43.57249224 +0000 UTC m=+88.326560563 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.077964 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.078285 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.078538 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.078952 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.079287 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.079443 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.079564 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.079588 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.080070 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.080443 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.080718 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.080966 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.080992 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.081004 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.081023 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.081035 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.081069 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.082005 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.082315 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.082503 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.082712 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.082851 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.083107 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.084700 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.085484 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.085687 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.085733 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.085989 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.086858 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.087636 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.088023 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.089115 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.089491 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090103 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090157 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090165 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090237 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090492 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.090777 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091034 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091281 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091522 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091527 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091592 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.091876 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092043 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092227 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092237 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092717 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092603 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092557 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092939 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093037 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093168 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093313 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093349 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093496 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093501 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.092408 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.093873 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.094898 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.095045 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.095000 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.097281 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.097379 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.097644 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.098462 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.098858 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.099635 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.099741 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100019 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100485 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100541 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100591 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100690 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.100970 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.101153 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.102159 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.109091 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.109302 4826 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.112752 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.117815 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.126142 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.133242 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.133625 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.146412 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.146533 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.146679 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.146906 4826 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147006 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147107 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147191 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147287 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147375 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147464 4826 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147545 4826 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147628 4826 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147753 4826 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147848 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.147929 4826 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148002 4826 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148088 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 
18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148169 4826 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148276 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148368 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148450 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148523 4826 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.146808 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148603 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 
18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148767 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148797 4826 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148818 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148833 4826 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148846 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148859 4826 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148871 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc 
kubenswrapper[4826]: I0319 18:57:43.148884 4826 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148898 4826 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148911 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148923 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148936 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148949 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148962 4826 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148976 4826 reconciler_common.go:293] "Volume detached 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.148990 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149003 4826 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149020 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149037 4826 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149049 4826 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149062 4826 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149079 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149096 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149113 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149130 4826 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149147 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149159 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149173 4826 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149188 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149204 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149221 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149238 4826 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149257 4826 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149272 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149285 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149302 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") 
on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149314 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149326 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149338 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149350 4826 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149362 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149374 4826 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149386 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149397 4826 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149409 4826 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149422 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149433 4826 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149445 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149457 4826 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149469 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149485 4826 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149499 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149511 4826 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149522 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149535 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149548 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149562 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149574 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149585 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149597 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149609 4826 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149622 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149637 4826 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149694 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149710 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" 
DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149722 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149735 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149747 4826 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149759 4826 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149772 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149784 4826 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149795 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149808 4826 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149819 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149831 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149844 4826 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149857 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149869 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149881 4826 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149894 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.149906 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.185309 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.185800 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.185987 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.186153 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.186287 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.265380 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.278036 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.291574 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.292039 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.292059 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.292085 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.292104 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.292230 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: W0319 18:57:43.298879 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-d27062a2b6c9aaec9df7f0f49c34b4984d7e2522afe87856bea8add0b21f3766 WatchSource:0}: Error finding container d27062a2b6c9aaec9df7f0f49c34b4984d7e2522afe87856bea8add0b21f3766: Status 404 returned error can't find the container with id d27062a2b6c9aaec9df7f0f49c34b4984d7e2522afe87856bea8add0b21f3766 Mar 19 18:57:43 crc kubenswrapper[4826]: W0319 18:57:43.323589 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-dcf57e00c75fcf6b54822a1ab4bf12bd5862e8080f1da35e9ceaa3c9500b9d21 WatchSource:0}: Error finding container dcf57e00c75fcf6b54822a1ab4bf12bd5862e8080f1da35e9ceaa3c9500b9d21: Status 404 returned error can't find the container with id dcf57e00c75fcf6b54822a1ab4bf12bd5862e8080f1da35e9ceaa3c9500b9d21 Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.396877 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.396913 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.396926 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.396945 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.396958 4826 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.398672 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dcf57e00c75fcf6b54822a1ab4bf12bd5862e8080f1da35e9ceaa3c9500b9d21"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.400086 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d27062a2b6c9aaec9df7f0f49c34b4984d7e2522afe87856bea8add0b21f3766"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.401556 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eb0674c0f0355c9b1e771aa9bd8484f1dc2445475181ba77ca446733b311c5d1"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.500216 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.500275 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.500293 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.500320 4826 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.500338 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.553736 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.553851 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.553916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.554028 4826 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.554046 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:44.553995552 +0000 UTC m=+89.308063895 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.554112 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:44.554095745 +0000 UTC m=+89.308164318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.554129 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.554260 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:44.554230069 +0000 UTC m=+89.308298412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.602786 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.602866 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.602890 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.602930 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.602956 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.655273 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.655354 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655517 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655544 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655558 4826 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655620 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:44.655601475 +0000 UTC m=+89.409669788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655766 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655803 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655818 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: E0319 18:57:43.655914 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:44.655887353 +0000 UTC m=+89.409955676 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.706286 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.706355 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.706383 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.706416 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.706439 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.809513 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.809572 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.809589 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.809608 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.809620 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.912873 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.912946 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.912964 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.912993 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.913011 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:43Z","lastTransitionTime":"2026-03-19T18:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.983020 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.984180 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.985814 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.986719 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.988149 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.988960 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.990050 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.991497 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.992485 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.994071 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.994868 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.996392 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.996930 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.997421 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.998401 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 18:57:43 crc kubenswrapper[4826]: I0319 18:57:43.998931 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:43.999976 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.000377 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.001090 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.002065 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.002517 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.003491 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.004060 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.005089 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.005539 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.006155 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.007196 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.007722 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.009107 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.010375 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.011397 4826 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.012425 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.015533 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016254 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016285 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016294 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016311 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016322 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.016605 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.017455 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.020524 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.021297 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.022637 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.023412 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.024538 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.025163 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.025795 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.027037 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.028110 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.028705 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.029584 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.030314 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.032931 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.033296 4826 scope.go:117] "RemoveContainer" 
containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.033571 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.034536 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.035482 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.035956 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.036724 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.038361 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.039473 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.041450 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.119459 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.119497 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.119507 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.119521 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.119532 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.223082 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.223160 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.223180 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.223210 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.223233 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.325856 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.325925 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.325942 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.325965 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.325979 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.408682 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7ae146e79708b3c4f9336667a00aec954144547dc1d5c45ad80eaed9469db0b3"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.408803 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dc95fdf8e065c97198bd621aca51c2184ee4da44d93565f9282b4003aa861263"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.411075 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"63ba500b673a9a0f5fcc94e8d9abb40ab97cd76e6ea34b72828f9e432e8038f6"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.411521 4826 scope.go:117] "RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.411724 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.428878 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.428959 4826 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.428991 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.429028 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.429060 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.439136 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e43f424d-e1b9-437a-9f69-704a183575d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:22.674570 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:22.674875 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:22.676118 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188538338/tls.crt::/tmp/serving-cert-1188538338/tls.key\\\\\\\"\\\\nI0319 18:57:23.159174 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:23.162482 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:23.162505 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:23.162534 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:23.162545 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:23.171891 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 18:57:23.171895 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 18:57:23.171946 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:23.171957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:23.171966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:23.171972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:23.171978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:23.171984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 18:57:23.174477 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.453413 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.472372 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ae146e79708b3c4f9336667a00aec954144547dc1d5c45ad80eaed9469db0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc95fdf8e065c97198bd621aca51c2184ee4da44d93565f9282b4003aa861263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.486967 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.502104 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.532170 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.532238 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.532253 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.532274 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.532287 4826 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.534893 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.560135 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.563434 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.563518 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.563638 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:46.563602026 +0000 UTC m=+91.317670339 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.563760 4826 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.563888 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:46.563863142 +0000 UTC m=+91.317931455 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.563790 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.563963 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.564032 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:46.564019837 +0000 UTC m=+91.318088160 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.582027 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e43f424d-e1b9-437a-9f69-704a183575d4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T18:56:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T18:57:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 18:57:22.674570 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 18:57:22.674875 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 18:57:22.676118 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188538338/tls.crt::/tmp/serving-cert-1188538338/tls.key\\\\\\\"\\\\nI0319 18:57:23.159174 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 18:57:23.162482 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 18:57:23.162505 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 18:57:23.162534 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 18:57:23.162545 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 18:57:23.171891 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 18:57:23.171895 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 18:57:23.171946 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:23.171957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 18:57:23.171966 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 18:57:23.171972 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 18:57:23.171978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 18:57:23.171984 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 18:57:23.174477 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T18:57:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:56:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T18:56:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T18:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T18:56:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.603623 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.619223 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ae146e79708b3c4f9336667a00aec954144547dc1d5c45ad80eaed9469db0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc95fdf8e065c97198bd621aca51c2184ee4da44d93565f9282b4003aa861263\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.632450 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.634557 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.634685 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: 
I0319 18:57:44.634771 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.634864 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.634934 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.647080 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.663133 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63ba500b673a9a0f5fcc94e8d9abb40ab97cd76e6ea34b72828f9e432e8038f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T18:57:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.664312 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.664428 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664594 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664678 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664702 4826 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664613 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664846 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664909 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.664805 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:46.664753376 +0000 UTC m=+91.418821729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.665035 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:46.665002742 +0000 UTC m=+91.419071085 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.678163 4826 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T18:57:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T18:57:44Z is after 2025-08-24T17:21:41Z" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.737163 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.737200 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.737209 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.737227 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.737241 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.839796 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.839848 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.839859 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.839879 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.839891 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.942967 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.943013 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.943028 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.943050 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.943066 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:44Z","lastTransitionTime":"2026-03-19T18:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.975980 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.976127 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.976497 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:44 crc kubenswrapper[4826]: I0319 18:57:44.976699 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.976944 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:44 crc kubenswrapper[4826]: E0319 18:57:44.976731 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.045917 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.045943 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.045952 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.045966 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.045976 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.148320 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.148361 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.148370 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.148385 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.148395 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.250603 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.250673 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.250688 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.250713 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.250726 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.353513 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.353951 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.353963 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.353981 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.354004 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.456528 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.456593 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.456615 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.456647 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.456713 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.559024 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.559078 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.559087 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.559105 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.559120 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.661881 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.661939 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.661949 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.661969 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.661979 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.764924 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.764974 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.764985 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.765009 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.765021 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.868736 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.868784 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.868804 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.868830 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.868848 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.973238 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.973290 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.973310 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.973328 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:45 crc kubenswrapper[4826]: I0319 18:57:45.973339 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:45Z","lastTransitionTime":"2026-03-19T18:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.075989 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.076059 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.076085 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.076118 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.076141 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.180471 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.180539 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.180557 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.180581 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.180599 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.283761 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.283824 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.283836 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.283858 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.283871 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.387786 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.387876 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.387900 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.387936 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.387960 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.490809 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.490860 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.490873 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.490896 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.490912 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.587724 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.587924 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.588024 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:50.587966531 +0000 UTC m=+95.342034884 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.588074 4826 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.588179 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:50.588135765 +0000 UTC m=+95.342204118 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.588323 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.588590 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.588796 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:50.588763823 +0000 UTC m=+95.342832166 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.594594 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.594709 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.594736 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.594768 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.594789 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.689236 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.689304 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689441 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689459 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689473 4826 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689529 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:50.689511342 +0000 UTC m=+95.443579675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689593 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689606 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689617 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.689646 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:50.689637526 +0000 UTC m=+95.443705849 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.698206 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.698398 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.698500 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.698604 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.698764 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.812093 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.812149 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.812169 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.812194 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.812212 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.916313 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.916377 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.916396 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.916424 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.916448 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:46Z","lastTransitionTime":"2026-03-19T18:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.976081 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.976205 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.976260 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:46 crc kubenswrapper[4826]: I0319 18:57:46.976295 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.976425 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:46 crc kubenswrapper[4826]: E0319 18:57:46.976586 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.020319 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.020386 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.020403 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.020429 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.020448 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.122536 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.122614 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.122640 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.122710 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.122738 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.225523 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.225611 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.225641 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.225708 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.225737 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.328573 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.328631 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.328645 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.328696 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.328713 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.425186 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d4b01c2c24651b431ad713bd0626b8d25efed04190a8ecbdd8781745c2fef128"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.432711 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.432882 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.433893 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.434035 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.434146 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.540684 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.540778 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.540795 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.540818 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.540830 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.644797 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.645165 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.645323 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.645460 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.645825 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.750640 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.750772 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.750788 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.750811 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.750827 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.853953 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.854368 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.854448 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.854556 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.854720 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.958328 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.959481 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.959784 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.959988 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:47 crc kubenswrapper[4826]: I0319 18:57:47.960142 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:47Z","lastTransitionTime":"2026-03-19T18:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.063786 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.065057 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.065154 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.065242 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.065324 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.168210 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.168255 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.168276 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.168307 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.168330 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.272433 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.272955 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.273205 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.273387 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.273564 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.377188 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.377594 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.377736 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.377832 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.377910 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.481784 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.481842 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.481861 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.481889 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.481909 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.585188 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.585254 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.585272 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.585301 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.585320 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.687841 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.688232 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.688309 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.688387 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.688457 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.791684 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.791728 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.791738 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.791752 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.791762 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.894508 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.894609 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.894629 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.894684 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.894704 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:48Z","lastTransitionTime":"2026-03-19T18:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.975486 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.975537 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:48 crc kubenswrapper[4826]: I0319 18:57:48.975537 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:48 crc kubenswrapper[4826]: E0319 18:57:48.975741 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:48 crc kubenswrapper[4826]: E0319 18:57:48.975882 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:48 crc kubenswrapper[4826]: E0319 18:57:48.975977 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.002779 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.002829 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.002845 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.002871 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.002889 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.106265 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.106337 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.106357 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.106387 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.106407 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.209507 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.209588 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.209605 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.209629 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.209641 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.312811 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.312887 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.312911 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.312945 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.312974 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.415807 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.415875 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.415895 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.415925 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.415946 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.519550 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.519618 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.519638 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.519710 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.519733 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.623401 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.623486 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.623504 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.623534 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.623555 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.726346 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.726404 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.726423 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.726449 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.726467 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.829769 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.829844 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.829873 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.830739 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.830835 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.933919 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.934330 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.934422 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.934525 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:49 crc kubenswrapper[4826]: I0319 18:57:49.934621 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:49Z","lastTransitionTime":"2026-03-19T18:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.037142 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.037998 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.038105 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.038197 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.038295 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:50Z","lastTransitionTime":"2026-03-19T18:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.088113 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.088182 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.088200 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.088229 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.088250 4826 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T18:57:50Z","lastTransitionTime":"2026-03-19T18:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.625388 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.625509 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.625556 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:57:58.625529309 +0000 UTC m=+103.379597622 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.625582 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.625642 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.625677 4826 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.625721 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:58.625709054 +0000 UTC m=+103.379777367 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.625744 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:58.625732784 +0000 UTC m=+103.379801097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.726890 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.727145 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727086 4826 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727266 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727288 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727380 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:58.727354228 +0000 UTC m=+103.481422581 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727481 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727554 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727605 4826 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.727817 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 18:57:58.7278077 +0000 UTC m=+103.481876013 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.960258 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.975400 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.975443 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.975489 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.975566 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.975823 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:50 crc kubenswrapper[4826]: E0319 18:57:50.975955 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:50 crc kubenswrapper[4826]: I0319 18:57:50.977221 4826 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 18:57:52 crc kubenswrapper[4826]: I0319 18:57:52.975965 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:52 crc kubenswrapper[4826]: I0319 18:57:52.976026 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:52 crc kubenswrapper[4826]: I0319 18:57:52.976025 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:52 crc kubenswrapper[4826]: E0319 18:57:52.977303 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:52 crc kubenswrapper[4826]: E0319 18:57:52.977047 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:52 crc kubenswrapper[4826]: E0319 18:57:52.977404 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:54 crc kubenswrapper[4826]: I0319 18:57:54.975251 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:54 crc kubenswrapper[4826]: I0319 18:57:54.975387 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:54 crc kubenswrapper[4826]: E0319 18:57:54.975436 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:54 crc kubenswrapper[4826]: I0319 18:57:54.975458 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:54 crc kubenswrapper[4826]: E0319 18:57:54.975886 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:54 crc kubenswrapper[4826]: E0319 18:57:54.975943 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:54 crc kubenswrapper[4826]: I0319 18:57:54.998821 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.014966 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gznc8"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.015757 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.027303 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.027988 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.028614 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.043731 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zz87p"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.044476 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jbz9t"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.045708 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fwtqp"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.045953 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.046286 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.046647 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052335 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052363 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052496 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052537 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052644 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052921 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.052942 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.053075 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.053151 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.054941 
4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.054994 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.055256 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.060407 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmdll"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.062837 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.064829 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.065140 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.065298 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.065843 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.066234 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.070116 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.070616 4826 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.109639 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.109607159 podStartE2EDuration="1.109607159s" podCreationTimestamp="2026-03-19 18:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:55.085426341 +0000 UTC m=+99.839494674" watchObservedRunningTime="2026-03-19 18:57:55.109607159 +0000 UTC m=+99.863675502" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.141093 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.142126 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.143988 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.146116 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.146448 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.146671 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.168940 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169076 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169138 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169164 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169248 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169273 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-conf-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169295 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-proxy-tls\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169318 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-cnibin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169341 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-os-release\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169364 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169392 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-cni-binary-copy\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169429 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-bin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169463 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169495 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169517 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169541 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-socket-dir-parent\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169679 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-multus\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169793 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhqg7\" (UniqueName: \"kubernetes.io/projected/66ede589-eceb-497a-b51a-f702f9181969-kube-api-access-zhqg7\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169839 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169885 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6676cf07-e314-4266-b5b4-a8e74e26992c-hosts-file\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169920 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-os-release\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169955 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.169989 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170021 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170055 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjb7\" (UniqueName: \"kubernetes.io/projected/6676cf07-e314-4266-b5b4-a8e74e26992c-kube-api-access-xvjb7\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc 
kubenswrapper[4826]: I0319 18:57:55.170083 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-netns\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170112 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170144 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-mcd-auth-proxy-config\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170185 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170216 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2t87\" (UniqueName: \"kubernetes.io/projected/5861ee46-4813-4909-ab68-55e3ba84a2b9-kube-api-access-j2t87\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: 
\"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170290 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-multus-certs\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170324 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170358 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-etc-kubernetes\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170423 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-cnibin\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170456 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrv8\" (UniqueName: \"kubernetes.io/projected/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-kube-api-access-qwrv8\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170487 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-system-cni-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170514 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-cni-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170580 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-hostroot\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170611 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-multus-daemon-config\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170643 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170703 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170741 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4qs\" (UniqueName: \"kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170808 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170844 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170877 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170910 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-rootfs\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170952 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-k8s-cni-cncf-io\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.170978 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-kubelet\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.256449 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gfrv5"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.256994 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.259139 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.259170 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.259917 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.260040 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.275805 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.275916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-multus-certs\") 
pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276005 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276043 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-system-cni-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276065 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276182 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-etc-kubernetes\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276249 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-cnibin\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " 
pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276311 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrv8\" (UniqueName: \"kubernetes.io/projected/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-kube-api-access-qwrv8\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276373 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-system-cni-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276421 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-cni-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276533 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-hostroot\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276594 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-multus-daemon-config\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 
18:57:55.276652 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276739 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276786 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4qs\" (UniqueName: \"kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276868 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.276977 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277026 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-rootfs\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277084 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-k8s-cni-cncf-io\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277142 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-kubelet\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277164 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277193 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277242 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277263 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-multus-certs\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277290 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277336 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277343 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277398 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277439 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-conf-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277482 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-proxy-tls\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277538 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-cnibin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277580 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-os-release\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277629 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277709 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-cni-binary-copy\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277764 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-bin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277811 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277822 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-cni-dir\") pod \"multus-fwtqp\" (UID: 
\"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277878 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/724ceb00-ea74-4a49-b344-bf4d80a988fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277919 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-hostroot\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277918 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-cnibin\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277998 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278033 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-system-cni-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " 
pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.277935 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278065 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278092 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-etc-kubernetes\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278127 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278147 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278171 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-socket-dir-parent\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278206 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-multus\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278219 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-conf-dir\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278247 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278283 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc 
kubenswrapper[4826]: I0319 18:57:55.278343 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhqg7\" (UniqueName: \"kubernetes.io/projected/66ede589-eceb-497a-b51a-f702f9181969-kube-api-access-zhqg7\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278375 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278409 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724ceb00-ea74-4a49-b344-bf4d80a988fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278433 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6676cf07-e314-4266-b5b4-a8e74e26992c-hosts-file\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278464 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-os-release\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: 
I0319 18:57:55.278499 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278522 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278545 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278580 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724ceb00-ea74-4a49-b344-bf4d80a988fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278612 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjb7\" (UniqueName: \"kubernetes.io/projected/6676cf07-e314-4266-b5b4-a8e74e26992c-kube-api-access-xvjb7\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 
18:57:55.278641 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-netns\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278790 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5861ee46-4813-4909-ab68-55e3ba84a2b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278870 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278898 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-multus\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278927 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-netns\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.278968 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279103 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-rootfs\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279160 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-run-k8s-cni-cncf-io\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279199 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-multus-daemon-config\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279090 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279211 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-kubelet\") pod 
\"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279001 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279569 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279598 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-mcd-auth-proxy-config\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279599 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279613 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279704 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2t87\" (UniqueName: \"kubernetes.io/projected/5861ee46-4813-4909-ab68-55e3ba84a2b9-kube-api-access-j2t87\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-multus-socket-dir-parent\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279727 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279777 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmdll\" (UID: 
\"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279603 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279854 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-cnibin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.279985 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.280238 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-os-release\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.280492 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-mcd-auth-proxy-config\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 
18:57:55.280559 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/66ede589-eceb-497a-b51a-f702f9181969-cni-binary-copy\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.280606 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.280699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.283701 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.283762 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6676cf07-e314-4266-b5b4-a8e74e26992c-hosts-file\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.283815 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5861ee46-4813-4909-ab68-55e3ba84a2b9-os-release\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.283842 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.283867 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66ede589-eceb-497a-b51a-f702f9181969-host-var-lib-cni-bin\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.285229 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.285818 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.300041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrv8\" (UniqueName: 
\"kubernetes.io/projected/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-kube-api-access-qwrv8\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.302054 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b456fa3f-c7a7-45ca-b560-e7a9b21be05a-proxy-tls\") pod \"machine-config-daemon-zz87p\" (UID: \"b456fa3f-c7a7-45ca-b560-e7a9b21be05a\") " pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.304900 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhqg7\" (UniqueName: \"kubernetes.io/projected/66ede589-eceb-497a-b51a-f702f9181969-kube-api-access-zhqg7\") pod \"multus-fwtqp\" (UID: \"66ede589-eceb-497a-b51a-f702f9181969\") " pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.305849 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4qs\" (UniqueName: \"kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs\") pod \"ovnkube-node-tmdll\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.306414 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2t87\" (UniqueName: \"kubernetes.io/projected/5861ee46-4813-4909-ab68-55e3ba84a2b9-kube-api-access-j2t87\") pod \"multus-additional-cni-plugins-jbz9t\" (UID: \"5861ee46-4813-4909-ab68-55e3ba84a2b9\") " pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.306984 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjb7\" (UniqueName: 
\"kubernetes.io/projected/6676cf07-e314-4266-b5b4-a8e74e26992c-kube-api-access-xvjb7\") pod \"node-resolver-gznc8\" (UID: \"6676cf07-e314-4266-b5b4-a8e74e26992c\") " pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.349208 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gznc8" Mar 19 18:57:55 crc kubenswrapper[4826]: W0319 18:57:55.364866 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6676cf07_e314_4266_b5b4_a8e74e26992c.slice/crio-106270e9eec73f3ba931f21d8d8fccf784570b6066d41043ec0d58a1ac8ceb5e WatchSource:0}: Error finding container 106270e9eec73f3ba931f21d8d8fccf784570b6066d41043ec0d58a1ac8ceb5e: Status 404 returned error can't find the container with id 106270e9eec73f3ba931f21d8d8fccf784570b6066d41043ec0d58a1ac8ceb5e Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.375793 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380231 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-serviceca\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380345 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/724ceb00-ea74-4a49-b344-bf4d80a988fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380439 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380512 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380594 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380603 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724ceb00-ea74-4a49-b344-bf4d80a988fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380557 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/724ceb00-ea74-4a49-b344-bf4d80a988fd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380695 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724ceb00-ea74-4a49-b344-bf4d80a988fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.380730 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v66c9\" (UniqueName: \"kubernetes.io/projected/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-kube-api-access-v66c9\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 
18:57:55.380755 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-host\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.381634 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/724ceb00-ea74-4a49-b344-bf4d80a988fd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.385338 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724ceb00-ea74-4a49-b344-bf4d80a988fd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.386788 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fwtqp" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.399347 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.401023 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/724ceb00-ea74-4a49-b344-bf4d80a988fd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vpfc2\" (UID: \"724ceb00-ea74-4a49-b344-bf4d80a988fd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.407433 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:57:55 crc kubenswrapper[4826]: W0319 18:57:55.412506 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ede589_eceb_497a_b51a_f702f9181969.slice/crio-f861192e648d212f780328a497384fe9f04220dec878414860d202eed5291d0f WatchSource:0}: Error finding container f861192e648d212f780328a497384fe9f04220dec878414860d202eed5291d0f: Status 404 returned error can't find the container with id f861192e648d212f780328a497384fe9f04220dec878414860d202eed5291d0f Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.441857 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.442614 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.445294 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.445340 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 18:57:55 crc kubenswrapper[4826]: W0319 18:57:55.446865 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a08d12_9af8_4524_8312_dac430ab73ac.slice/crio-3f12baf1fc0b617ff99a728b69b11b3b665630761f0efd3183686248dfe6d4c8 WatchSource:0}: Error finding container 3f12baf1fc0b617ff99a728b69b11b3b665630761f0efd3183686248dfe6d4c8: Status 404 returned error can't find the container with id 3f12baf1fc0b617ff99a728b69b11b3b665630761f0efd3183686248dfe6d4c8 Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.453484 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fwtqp" event={"ID":"66ede589-eceb-497a-b51a-f702f9181969","Type":"ContainerStarted","Data":"f861192e648d212f780328a497384fe9f04220dec878414860d202eed5291d0f"} Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.457372 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerStarted","Data":"5a5c8ec1917d46c5d51fd0aa9b94b666c81fc55e89c63e0de1305c1a8213c1cc"} Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.461087 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gznc8" event={"ID":"6676cf07-e314-4266-b5b4-a8e74e26992c","Type":"ContainerStarted","Data":"106270e9eec73f3ba931f21d8d8fccf784570b6066d41043ec0d58a1ac8ceb5e"} Mar 19 18:57:55 crc 
kubenswrapper[4826]: I0319 18:57:55.464030 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.464973 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"837c1ce789441da4fa95ce29421eb26a7d7256ad48489c0909abe0adf7e71cf1"} Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.470028 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7fdpm"] Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.470547 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc kubenswrapper[4826]: E0319 18:57:55.470615 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.481846 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-serviceca\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.481925 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v66c9\" (UniqueName: \"kubernetes.io/projected/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-kube-api-access-v66c9\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.481950 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-host\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.482036 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-host\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.483113 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-serviceca\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: W0319 18:57:55.499100 4826 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724ceb00_ea74_4a49_b344_bf4d80a988fd.slice/crio-d09da393d972055f5bc9cc8ef643911deb169aed633b1a888ba3176703aacb0a WatchSource:0}: Error finding container d09da393d972055f5bc9cc8ef643911deb169aed633b1a888ba3176703aacb0a: Status 404 returned error can't find the container with id d09da393d972055f5bc9cc8ef643911deb169aed633b1a888ba3176703aacb0a Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.503244 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v66c9\" (UniqueName: \"kubernetes.io/projected/1ea4070c-03a3-41df-8e7f-b8bd2ade1c46-kube-api-access-v66c9\") pod \"node-ca-gfrv5\" (UID: \"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46\") " pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.571234 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gfrv5" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582553 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87745\" (UniqueName: \"kubernetes.io/projected/4aae65e3-6afb-4324-b432-aab2e3c294a3-kube-api-access-87745\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582586 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582625 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582647 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmlgb\" (UniqueName: \"kubernetes.io/projected/9f822d71-562c-4d2c-917f-82281bef6c8a-kube-api-access-vmlgb\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582686 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.582793 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc kubenswrapper[4826]: W0319 18:57:55.584836 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ea4070c_03a3_41df_8e7f_b8bd2ade1c46.slice/crio-23724a6ebf05575eac705c45a71887261fc61ab397ac9efb452eb3fad96ab0db 
WatchSource:0}: Error finding container 23724a6ebf05575eac705c45a71887261fc61ab397ac9efb452eb3fad96ab0db: Status 404 returned error can't find the container with id 23724a6ebf05575eac705c45a71887261fc61ab397ac9efb452eb3fad96ab0db Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.683953 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmlgb\" (UniqueName: \"kubernetes.io/projected/9f822d71-562c-4d2c-917f-82281bef6c8a-kube-api-access-vmlgb\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.684014 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.684054 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.684123 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87745\" (UniqueName: \"kubernetes.io/projected/4aae65e3-6afb-4324-b432-aab2e3c294a3-kube-api-access-87745\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.684178 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.684250 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: E0319 18:57:55.685166 4826 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:55 crc kubenswrapper[4826]: E0319 18:57:55.685258 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs podName:9f822d71-562c-4d2c-917f-82281bef6c8a nodeName:}" failed. No retries permitted until 2026-03-19 18:57:56.185230401 +0000 UTC m=+100.939298754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs") pod "network-metrics-daemon-7fdpm" (UID: "9f822d71-562c-4d2c-917f-82281bef6c8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.685922 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.688790 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.688811 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4aae65e3-6afb-4324-b432-aab2e3c294a3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.702457 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmlgb\" (UniqueName: \"kubernetes.io/projected/9f822d71-562c-4d2c-917f-82281bef6c8a-kube-api-access-vmlgb\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:55 crc 
kubenswrapper[4826]: I0319 18:57:55.703361 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87745\" (UniqueName: \"kubernetes.io/projected/4aae65e3-6afb-4324-b432-aab2e3c294a3-kube-api-access-87745\") pod \"ovnkube-control-plane-749d76644c-cc65v\" (UID: \"4aae65e3-6afb-4324-b432-aab2e3c294a3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:55 crc kubenswrapper[4826]: I0319 18:57:55.763678 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.189645 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.189939 4826 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.190078 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs podName:9f822d71-562c-4d2c-917f-82281bef6c8a nodeName:}" failed. No retries permitted until 2026-03-19 18:57:57.190046677 +0000 UTC m=+101.944115190 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs") pod "network-metrics-daemon-7fdpm" (UID: "9f822d71-562c-4d2c-917f-82281bef6c8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.469349 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"6579506a95d12b3df1615304ae7df172177b051306069b6fbeae1144e96bfcff"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.469817 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.470359 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gfrv5" event={"ID":"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46","Type":"ContainerStarted","Data":"2786a964418c126ba51719f1e017c556928c47be8aeb0bc98d12f2f933e5c0cb"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.470398 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gfrv5" event={"ID":"1ea4070c-03a3-41df-8e7f-b8bd2ade1c46","Type":"ContainerStarted","Data":"23724a6ebf05575eac705c45a71887261fc61ab397ac9efb452eb3fad96ab0db"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.472515 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" containerID="d4e5c9f753374e190219d5fdebc7e27ef98514d080852899c1f80a86bc2969a2" exitCode=0 Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.472615 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"d4e5c9f753374e190219d5fdebc7e27ef98514d080852899c1f80a86bc2969a2"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.473808 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gznc8" event={"ID":"6676cf07-e314-4266-b5b4-a8e74e26992c","Type":"ContainerStarted","Data":"659ef570b18339bc7e3fb7f7cabbb543fc07810187a9a362412b6e6ab6e78abb"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.475441 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" event={"ID":"724ceb00-ea74-4a49-b344-bf4d80a988fd","Type":"ContainerStarted","Data":"ff0777955116a1f46fdabdad2bd351faa39771cd82711089fba4fee5bbbea281"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.475487 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" event={"ID":"724ceb00-ea74-4a49-b344-bf4d80a988fd","Type":"ContainerStarted","Data":"d09da393d972055f5bc9cc8ef643911deb169aed633b1a888ba3176703aacb0a"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.483133 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" event={"ID":"4aae65e3-6afb-4324-b432-aab2e3c294a3","Type":"ContainerStarted","Data":"722954210cb9dbf70dc42994612dcf13cae21e4d3980e1d7899e0169f927a408"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.483203 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" event={"ID":"4aae65e3-6afb-4324-b432-aab2e3c294a3","Type":"ContainerStarted","Data":"d5685fabdbfad81f1d09ebef8e0a055d79eaad947947bd9a71bf0c40860d174c"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.483216 4826 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" event={"ID":"4aae65e3-6afb-4324-b432-aab2e3c294a3","Type":"ContainerStarted","Data":"b209a6c43b7710331ba1311cd26fde34c0ec7930373dba3796eaba2a23d1149c"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.489394 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" exitCode=0 Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.489467 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.489522 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"3f12baf1fc0b617ff99a728b69b11b3b665630761f0efd3183686248dfe6d4c8"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.492464 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podStartSLOduration=40.492376648 podStartE2EDuration="40.492376648s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.491647649 +0000 UTC m=+101.245715972" watchObservedRunningTime="2026-03-19 18:57:56.492376648 +0000 UTC m=+101.246444961" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.503947 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fwtqp" 
event={"ID":"66ede589-eceb-497a-b51a-f702f9181969","Type":"ContainerStarted","Data":"23953b69631e80f97d6e28d3ca594416c07f1a83bc445d00460c0edbb31552ab"} Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.538183 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vpfc2" podStartSLOduration=40.538163453 podStartE2EDuration="40.538163453s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.512067623 +0000 UTC m=+101.266135966" watchObservedRunningTime="2026-03-19 18:57:56.538163453 +0000 UTC m=+101.292231766" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.560236 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gfrv5" podStartSLOduration=40.560217343 podStartE2EDuration="40.560217343s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.540071575 +0000 UTC m=+101.294139908" watchObservedRunningTime="2026-03-19 18:57:56.560217343 +0000 UTC m=+101.314285656" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.640926 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gznc8" podStartSLOduration=40.640900306 podStartE2EDuration="40.640900306s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.640189468 +0000 UTC m=+101.394257781" watchObservedRunningTime="2026-03-19 18:57:56.640900306 +0000 UTC m=+101.394968619" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.662840 4826 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cc65v" podStartSLOduration=39.662816483 podStartE2EDuration="39.662816483s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.6623188 +0000 UTC m=+101.416387113" watchObservedRunningTime="2026-03-19 18:57:56.662816483 +0000 UTC m=+101.416884796" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.689800 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fwtqp" podStartSLOduration=40.689773225 podStartE2EDuration="40.689773225s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:57:56.689457997 +0000 UTC m=+101.443526310" watchObservedRunningTime="2026-03-19 18:57:56.689773225 +0000 UTC m=+101.443841548" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.976033 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.976533 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.977129 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.977205 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.977258 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.977302 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:56 crc kubenswrapper[4826]: I0319 18:57:56.977349 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:56 crc kubenswrapper[4826]: E0319 18:57:56.977397 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.203158 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:57 crc kubenswrapper[4826]: E0319 18:57:57.203373 4826 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:57 crc kubenswrapper[4826]: E0319 18:57:57.203473 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs podName:9f822d71-562c-4d2c-917f-82281bef6c8a nodeName:}" failed. No retries permitted until 2026-03-19 18:57:59.203442303 +0000 UTC m=+103.957510606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs") pod "network-metrics-daemon-7fdpm" (UID: "9f822d71-562c-4d2c-917f-82281bef6c8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.509292 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerStarted","Data":"a0cb94d55df9a176b1c96a7cce62c1288af19f658f4504d3b8ae0515161c96bf"} Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.512295 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987"} Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.512353 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f"} Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.512369 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0"} Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.512384 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb"} Mar 19 18:57:57 crc kubenswrapper[4826]: I0319 18:57:57.976702 4826 scope.go:117] 
"RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:57:57 crc kubenswrapper[4826]: E0319 18:57:57.977325 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.527148 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" containerID="a0cb94d55df9a176b1c96a7cce62c1288af19f658f4504d3b8ae0515161c96bf" exitCode=0 Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.527251 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"a0cb94d55df9a176b1c96a7cce62c1288af19f658f4504d3b8ae0515161c96bf"} Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.540076 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36"} Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.540130 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6"} Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.726787 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727046 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.726992701 +0000 UTC m=+119.481061044 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.727157 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.727312 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.727370 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727378 4826 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727470 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.727444684 +0000 UTC m=+119.481513057 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727761 4826 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727811 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727851 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727871 4826 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.727955 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.727908916 +0000 UTC m=+119.481977269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.728004 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.727982208 +0000 UTC m=+119.482050671 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.829159 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.829506 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.829588 4826 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.829619 4826 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.829810 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:14.829764556 +0000 UTC m=+119.583833029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.975146 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.975242 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.975302 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.975423 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.975513 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.975562 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:57:58 crc kubenswrapper[4826]: I0319 18:57:58.975613 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:57:58 crc kubenswrapper[4826]: E0319 18:57:58.975685 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:57:59 crc kubenswrapper[4826]: I0319 18:57:59.234709 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:57:59 crc kubenswrapper[4826]: E0319 18:57:59.235010 4826 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:59 crc kubenswrapper[4826]: E0319 18:57:59.235208 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs podName:9f822d71-562c-4d2c-917f-82281bef6c8a nodeName:}" failed. No retries permitted until 2026-03-19 18:58:03.235163249 +0000 UTC m=+107.989231772 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs") pod "network-metrics-daemon-7fdpm" (UID: "9f822d71-562c-4d2c-917f-82281bef6c8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:57:59 crc kubenswrapper[4826]: I0319 18:57:59.546141 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" containerID="f0038bcb040a6fa3d0aae878d064ca3b076659248febc191390b80ecedaed8f9" exitCode=0 Mar 19 18:57:59 crc kubenswrapper[4826]: I0319 18:57:59.546208 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"f0038bcb040a6fa3d0aae878d064ca3b076659248febc191390b80ecedaed8f9"} Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.553903 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" containerID="e15adfcef5a848846fe4c9334535836ca01989089756ed53dbffa56c52e055a4" exitCode=0 Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.553974 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"e15adfcef5a848846fe4c9334535836ca01989089756ed53dbffa56c52e055a4"} Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.567236 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683"} Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.975740 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:00 crc kubenswrapper[4826]: E0319 18:58:00.975957 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.976358 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.976791 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:00 crc kubenswrapper[4826]: E0319 18:58:00.976730 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:00 crc kubenswrapper[4826]: E0319 18:58:00.976910 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:58:00 crc kubenswrapper[4826]: I0319 18:58:00.978215 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:00 crc kubenswrapper[4826]: E0319 18:58:00.978522 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:01 crc kubenswrapper[4826]: I0319 18:58:01.575802 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerStarted","Data":"b049f499ca3915bc7db298163e260e1926e39a8bea883499c45285a2a90f59ee"} Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.592917 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerStarted","Data":"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b"} Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.593949 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.594034 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.598109 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" 
containerID="b049f499ca3915bc7db298163e260e1926e39a8bea883499c45285a2a90f59ee" exitCode=0 Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.598185 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"b049f499ca3915bc7db298163e260e1926e39a8bea883499c45285a2a90f59ee"} Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.625748 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podStartSLOduration=45.625716725 podStartE2EDuration="45.625716725s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:02.6255185 +0000 UTC m=+107.379586903" watchObservedRunningTime="2026-03-19 18:58:02.625716725 +0000 UTC m=+107.379785048" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.633159 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.639211 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.975694 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:02 crc kubenswrapper[4826]: E0319 18:58:02.976218 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.975854 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.975728 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:02 crc kubenswrapper[4826]: E0319 18:58:02.976320 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:02 crc kubenswrapper[4826]: I0319 18:58:02.975867 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:02 crc kubenswrapper[4826]: E0319 18:58:02.976515 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:02 crc kubenswrapper[4826]: E0319 18:58:02.976632 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:58:03 crc kubenswrapper[4826]: I0319 18:58:03.294387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:03 crc kubenswrapper[4826]: E0319 18:58:03.294557 4826 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:03 crc kubenswrapper[4826]: E0319 18:58:03.294683 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs podName:9f822d71-562c-4d2c-917f-82281bef6c8a nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.294627544 +0000 UTC m=+116.048695897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs") pod "network-metrics-daemon-7fdpm" (UID: "9f822d71-562c-4d2c-917f-82281bef6c8a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 18:58:03 crc kubenswrapper[4826]: I0319 18:58:03.608502 4826 generic.go:334] "Generic (PLEG): container finished" podID="5861ee46-4813-4909-ab68-55e3ba84a2b9" containerID="8bb6f422cf86fa13e647a4fa53c52403b95fe6a6f2cf7f6967bd4b5020285e50" exitCode=0 Mar 19 18:58:03 crc kubenswrapper[4826]: I0319 18:58:03.608762 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerDied","Data":"8bb6f422cf86fa13e647a4fa53c52403b95fe6a6f2cf7f6967bd4b5020285e50"} Mar 19 18:58:03 crc kubenswrapper[4826]: I0319 18:58:03.609569 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.618085 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" event={"ID":"5861ee46-4813-4909-ab68-55e3ba84a2b9","Type":"ContainerStarted","Data":"9e446bfbb47019e49d3b382c29f736ccc226bdbc870aa294ed8a6ed320816f85"} Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.645072 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jbz9t" podStartSLOduration=48.645030634 podStartE2EDuration="48.645030634s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:04.644628763 +0000 UTC m=+109.398697156" watchObservedRunningTime="2026-03-19 18:58:04.645030634 +0000 UTC m=+109.399098977" Mar 19 18:58:04 crc 
kubenswrapper[4826]: I0319 18:58:04.675315 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7fdpm"] Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.675445 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:04 crc kubenswrapper[4826]: E0319 18:58:04.675561 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.975912 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.976027 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:04 crc kubenswrapper[4826]: E0319 18:58:04.976074 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:04 crc kubenswrapper[4826]: E0319 18:58:04.976235 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:04 crc kubenswrapper[4826]: I0319 18:58:04.976314 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:04 crc kubenswrapper[4826]: E0319 18:58:04.976411 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:06 crc kubenswrapper[4826]: I0319 18:58:06.975609 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:06 crc kubenswrapper[4826]: I0319 18:58:06.975958 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:06 crc kubenswrapper[4826]: I0319 18:58:06.975970 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:06 crc kubenswrapper[4826]: E0319 18:58:06.976920 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 18:58:06 crc kubenswrapper[4826]: E0319 18:58:06.977043 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fdpm" podUID="9f822d71-562c-4d2c-917f-82281bef6c8a" Mar 19 18:58:06 crc kubenswrapper[4826]: I0319 18:58:06.976060 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:06 crc kubenswrapper[4826]: E0319 18:58:06.977213 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 18:58:06 crc kubenswrapper[4826]: E0319 18:58:06.976734 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.841799 4826 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.842049 4826 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.898186 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h7mf4"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.899305 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.899983 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.900698 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.899992 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.902130 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6tq97"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.902945 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.903512 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zl2jh"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.903799 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.903861 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.904025 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.905610 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.909739 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.916459 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.917155 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.918238 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.919214 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.919960 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.921296 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zc8ht"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.922096 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.926992 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.927297 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.928231 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.929172 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.935981 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.936175 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.936766 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.936780 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.937556 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.937632 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 18:58:07 crc 
kubenswrapper[4826]: I0319 18:58:07.938041 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.938330 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.939563 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.939837 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.940047 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.940402 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.940732 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.941208 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.941907 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.942513 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cbmtf"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.942878 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zl2jh"] 
Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.943010 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.943349 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.949951 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.950213 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.950354 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.950458 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.950554 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.950709 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.967437 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.967699 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.967848 4826 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"client-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.971707 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-node-pullsecrets\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.971789 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.972078 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-auth-proxy-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.972921 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.988447 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 18:58:07 crc 
kubenswrapper[4826]: I0319 18:58:07.989092 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.989243 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.989333 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.990598 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.990762 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.990852 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.991790 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.993244 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.993374 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.993539 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994014 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:58:07 crc 
kubenswrapper[4826]: I0319 18:58:07.994192 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994332 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994423 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994574 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994674 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994783 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.973049 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-config\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994920 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994957 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-audit-dir\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.994982 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-encryption-config\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995020 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7h6q\" (UniqueName: \"kubernetes.io/projected/4e673de9-6eb1-430b-8123-1254957f125f-kube-api-access-f7h6q\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995049 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-config\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995075 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995105 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jm24\" (UniqueName: \"kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995149 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-image-import-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995168 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-encryption-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 
18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995188 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995214 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4aa156db-ba19-4535-ba78-b7a4b94e29e9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995235 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995258 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995281 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-audit-policies\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995334 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995359 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkr6v\" (UniqueName: \"kubernetes.io/projected/352eae31-d0e1-452b-8319-ab53b8095b5a-kube-api-access-kkr6v\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995383 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e673de9-6eb1-430b-8123-1254957f125f-audit-dir\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995406 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 
18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995426 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995448 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995475 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wwk\" (UniqueName: \"kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995500 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995521 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995540 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-serving-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995564 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5a99915-a5d4-4e6f-87e0-ead11079eeec-machine-approver-tls\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995589 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-client\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995608 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995767 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995609 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d8b63e-fcf9-45c2-98be-d2aa00660cee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995873 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-images\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995907 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eae31-d0e1-452b-8319-ab53b8095b5a-serving-cert\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995938 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995962 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-audit\") pod 
\"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.995984 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-serving-cert\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996007 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996036 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnblb\" (UniqueName: \"kubernetes.io/projected/4aa156db-ba19-4535-ba78-b7a4b94e29e9-kube-api-access-lnblb\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996032 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996062 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-service-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: 
\"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996087 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996111 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996131 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-etcd-client\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996153 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-serving-cert\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996174 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xvkzz\" (UniqueName: \"kubernetes.io/projected/84d8b63e-fcf9-45c2-98be-d2aa00660cee-kube-api-access-xvkzz\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996201 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996230 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996237 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996250 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996286 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996308 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996327 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fk87\" (UniqueName: \"kubernetes.io/projected/c5a99915-a5d4-4e6f-87e0-ead11079eeec-kube-api-access-4fk87\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996342 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ln6\" (UniqueName: \"kubernetes.io/projected/86f5311b-39ed-455f-a9bc-b83044d63db8-kube-api-access-v5ln6\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996363 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mgsm\" (UniqueName: \"kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm\") pod \"controller-manager-879f6c89f-brsbv\" (UID: 
\"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996382 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996397 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996414 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.996603 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997380 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997486 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997552 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997579 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997612 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997754 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997776 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997866 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997892 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997972 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997485 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.998029 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.998039 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997648 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997721 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.997969 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.998301 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.998794 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.998975 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.999073 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.999085 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 18:58:07 crc kubenswrapper[4826]: I0319 18:58:07.999134 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 18:58:08 crc 
kubenswrapper[4826]: I0319 18:58:08.004276 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.006282 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.009626 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.009805 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.009941 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.010299 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-p892r"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.010622 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.011030 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jmhs5"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.011521 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.011625 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.012189 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.012726 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.012221 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.014524 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.015841 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.016621 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.020643 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.021374 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.021373 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.021695 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.021785 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.021868 4826 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.022197 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.033940 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.034491 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.036571 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.036884 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.043255 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.043501 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.043702 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.046425 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.046637 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.046865 4826 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.047160 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.047326 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.047526 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.047860 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.049005 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.049294 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.049770 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.050042 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.050840 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.050972 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmbjc"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.058796 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.059369 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-drbf6"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.059746 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.060095 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6vlbh"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.060215 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.052992 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.059058 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.053143 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.060991 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.053197 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.053236 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.053277 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.061488 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.060181 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.061754 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.062056 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.062371 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.062915 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.063595 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.064221 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.066226 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.066580 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.067190 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.067715 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.068394 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.068796 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.069289 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.069896 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.070485 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.072461 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.077317 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.078757 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.079482 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.079724 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.084818 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.085842 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.086856 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.088286 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7jk4n"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.088842 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zc8ht"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.089009 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.094264 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.096430 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.100888 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-images\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101173 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61cc107-39c3-4add-b9a1-45c5d744ea4b-serving-cert\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101303 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/72f0a310-1676-49a4-826a-d83406d28e93-kube-api-access-659lx\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101400 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzlf\" (UniqueName: \"kubernetes.io/projected/0a13bc75-83b6-4952-8e8e-cd93809a87b5-kube-api-access-crzlf\") pod \"downloads-7954f5f757-cbmtf\" (UID: \"0a13bc75-83b6-4952-8e8e-cd93809a87b5\") " pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101508 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-etcd-client\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101618 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eae31-d0e1-452b-8319-ab53b8095b5a-serving-cert\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.101756 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102023 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-audit\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102185 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-serving-cert\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102286 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f0a310-1676-49a4-826a-d83406d28e93-serving-cert\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102396 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3acdc6-8778-4094-af1f-8f3824029d90-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102493 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzxr\" (UniqueName: \"kubernetes.io/projected/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-kube-api-access-hqzxr\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102613 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnblb\" (UniqueName: \"kubernetes.io/projected/4aa156db-ba19-4535-ba78-b7a4b94e29e9-kube-api-access-lnblb\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102807 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-service-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.104827 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.104950 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105111 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105204 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvkzz\" (UniqueName: \"kubernetes.io/projected/84d8b63e-fcf9-45c2-98be-d2aa00660cee-kube-api-access-xvkzz\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 
18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105289 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-etcd-client\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105390 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-serving-cert\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105524 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.105823 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v545x\" (UniqueName: \"kubernetes.io/projected/f61cc107-39c3-4add-b9a1-45c5d744ea4b-kube-api-access-v545x\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.106067 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-images\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: 
\"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.106165 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3acdc6-8778-4094-af1f-8f3824029d90-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.106262 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.106361 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.106457 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.107205 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.107594 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.107631 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.107686 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fk87\" (UniqueName: \"kubernetes.io/projected/c5a99915-a5d4-4e6f-87e0-ead11079eeec-kube-api-access-4fk87\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102206 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-images\") 
pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.103866 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-audit\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.102837 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.107714 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ln6\" (UniqueName: \"kubernetes.io/projected/86f5311b-39ed-455f-a9bc-b83044d63db8-kube-api-access-v5ln6\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.130276 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.130339 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-service-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc 
kubenswrapper[4826]: I0319 18:58:08.111344 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-serving-cert\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.130556 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.130894 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn895\" (UniqueName: \"kubernetes.io/projected/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-kube-api-access-tn895\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131413 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131586 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131743 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mgsm\" (UniqueName: \"kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131978 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-trusted-ca\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131984 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.131991 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352eae31-d0e1-452b-8319-ab53b8095b5a-serving-cert\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 
18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132058 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4br\" (UniqueName: \"kubernetes.io/projected/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-kube-api-access-gw4br\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132090 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f264ef7-44e5-4dee-91c5-89c87a000c9f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: \"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132249 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132308 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 18:58:08 crc 
kubenswrapper[4826]: I0319 18:58:08.132351 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132388 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66m4j\" (UniqueName: \"kubernetes.io/projected/90903b96-fde6-4803-93ab-15ecd16645af-kube-api-access-66m4j\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132444 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9181cd89-dbc9-4fe2-aacb-4f67003e5738-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-etcd-client\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.132967 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133217 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-node-pullsecrets\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133318 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133327 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f7edf-3d9d-468d-a18c-a08128eca13c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133363 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9181cd89-dbc9-4fe2-aacb-4f67003e5738-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133423 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-auth-proxy-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133473 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133502 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133533 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3acdc6-8778-4094-af1f-8f3824029d90-config\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133560 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-config\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133588 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133729 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-audit-dir\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133763 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-encryption-config\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133795 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66bw\" (UniqueName: 
\"kubernetes.io/projected/a7b17be3-fa05-45b6-ba33-b514326061b1-kube-api-access-z66bw\") pod \"migrator-59844c95c7-jfxcc\" (UID: \"a7b17be3-fa05-45b6-ba33-b514326061b1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133822 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-srv-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.133853 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7h6q\" (UniqueName: \"kubernetes.io/projected/4e673de9-6eb1-430b-8123-1254957f125f-kube-api-access-f7h6q\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.134076 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-serving-cert\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.135350 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.135420 
4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-node-pullsecrets\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.136046 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-auth-proxy-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.136078 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.136824 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-config\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.136897 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86f5311b-39ed-455f-a9bc-b83044d63db8-audit-dir\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 
18:58:08.136944 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.137169 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-config\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.137891 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d8b63e-fcf9-45c2-98be-d2aa00660cee-config\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.137949 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.137999 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-profile-collector-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.139211 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.140883 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141701 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141757 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-config\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141784 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9181cd89-dbc9-4fe2-aacb-4f67003e5738-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141815 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-serving-cert\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141952 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4e673de9-6eb1-430b-8123-1254957f125f-encryption-config\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.141991 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jm24\" (UniqueName: \"kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.142174 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-image-import-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 
18:58:08.142205 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-encryption-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.142265 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-service-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.142808 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.142858 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4aa156db-ba19-4535-ba78-b7a4b94e29e9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.142933 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143021 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143049 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143074 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-audit-policies\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143142 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143198 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 
18:58:08.143234 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkr6v\" (UniqueName: \"kubernetes.io/projected/352eae31-d0e1-452b-8319-ab53b8095b5a-kube-api-access-kkr6v\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143285 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e673de9-6eb1-430b-8123-1254957f125f-audit-dir\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143357 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143388 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-config\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143434 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbtp\" (UniqueName: \"kubernetes.io/projected/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-kube-api-access-dkbtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: 
\"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143459 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1392e2fd-a142-4584-8df9-7470b9441a3d-proxy-tls\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143503 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f7edf-3d9d-468d-a18c-a08128eca13c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143535 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143583 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143615 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143697 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wwk\" (UniqueName: \"kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143781 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-serving-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143810 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blslc\" (UniqueName: \"kubernetes.io/projected/6f264ef7-44e5-4dee-91c5-89c87a000c9f-kube-api-access-blslc\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: \"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 
crc kubenswrapper[4826]: I0319 18:58:08.143866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143869 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-image-import-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143897 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1f7edf-3d9d-468d-a18c-a08128eca13c-config\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143944 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-metrics-tls\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.143974 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72f0a310-1676-49a4-826a-d83406d28e93-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144026 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-client\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144051 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d8b63e-fcf9-45c2-98be-d2aa00660cee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144098 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144130 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9bh\" (UniqueName: \"kubernetes.io/projected/1392e2fd-a142-4584-8df9-7470b9441a3d-kube-api-access-pn9bh\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144178 
4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5a99915-a5d4-4e6f-87e0-ead11079eeec-machine-approver-tls\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.144886 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.146776 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.146806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.146884 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e673de9-6eb1-430b-8123-1254957f125f-audit-dir\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.147201 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5a99915-a5d4-4e6f-87e0-ead11079eeec-config\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.147531 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565778-hxldh"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.147995 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148360 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4aa156db-ba19-4535-ba78-b7a4b94e29e9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148693 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h7mf4"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148719 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148732 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p892r"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148743 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.148873 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.149049 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rf57"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.149574 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-serving-ca\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.149861 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.150141 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5a99915-a5d4-4e6f-87e0-ead11079eeec-machine-approver-tls\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.150451 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.147538 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.150728 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.150922 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.151172 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.152123 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.153035 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e673de9-6eb1-430b-8123-1254957f125f-audit-policies\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.153110 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.153840 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.154195 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/352eae31-d0e1-452b-8319-ab53b8095b5a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.154573 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-etcd-client\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.154761 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.155749 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.155917 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.157822 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.157765 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.158115 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmbjc"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.158676 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.158937 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d8b63e-fcf9-45c2-98be-d2aa00660cee-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.161424 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.163255 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.163368 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cbmtf"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.163436 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.164809 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6vlbh"] Mar 19 18:58:08 
crc kubenswrapper[4826]: I0319 18:58:08.165108 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jmhs5"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.166093 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.166220 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86f5311b-39ed-455f-a9bc-b83044d63db8-encryption-config\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.167088 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.167979 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.168940 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.169864 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-hxldh"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.170823 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.171755 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 
18:58:08.172904 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.173853 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rf57"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.174792 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.176807 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.177779 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.178845 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6tq97"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.179847 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7jk4n"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.181043 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.181992 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.182907 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cx4gd"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.182987 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.183608 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.184014 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ph9t5"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.184723 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.185037 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.186063 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.187084 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.188154 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cx4gd"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.189370 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ph9t5"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.190323 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-m7zht"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.191128 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.191348 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fc4kh"] Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.191948 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.204003 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.223535 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.243765 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245224 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f7edf-3d9d-468d-a18c-a08128eca13c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245262 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9181cd89-dbc9-4fe2-aacb-4f67003e5738-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 
18:58:08.245296 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3acdc6-8778-4094-af1f-8f3824029d90-config\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245324 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66bw\" (UniqueName: \"kubernetes.io/projected/a7b17be3-fa05-45b6-ba33-b514326061b1-kube-api-access-z66bw\") pod \"migrator-59844c95c7-jfxcc\" (UID: \"a7b17be3-fa05-45b6-ba33-b514326061b1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245342 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-srv-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245374 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-profile-collector-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245395 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-config\") pod \"etcd-operator-b45778765-tmbjc\" (UID: 
\"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245410 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9181cd89-dbc9-4fe2-aacb-4f67003e5738-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245427 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-serving-cert\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245450 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-service-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245482 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245520 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-config\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245540 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbtp\" (UniqueName: \"kubernetes.io/projected/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-kube-api-access-dkbtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245561 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1392e2fd-a142-4584-8df9-7470b9441a3d-proxy-tls\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245579 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f7edf-3d9d-468d-a18c-a08128eca13c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245598 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc 
kubenswrapper[4826]: I0319 18:58:08.245630 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blslc\" (UniqueName: \"kubernetes.io/projected/6f264ef7-44e5-4dee-91c5-89c87a000c9f-kube-api-access-blslc\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: \"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245664 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a1f7edf-3d9d-468d-a18c-a08128eca13c-config\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245688 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-metrics-tls\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245714 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72f0a310-1676-49a4-826a-d83406d28e93-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245736 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: 
\"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245758 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9bh\" (UniqueName: \"kubernetes.io/projected/1392e2fd-a142-4584-8df9-7470b9441a3d-kube-api-access-pn9bh\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245778 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61cc107-39c3-4add-b9a1-45c5d744ea4b-serving-cert\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245801 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/72f0a310-1676-49a4-826a-d83406d28e93-kube-api-access-659lx\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245843 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzlf\" (UniqueName: \"kubernetes.io/projected/0a13bc75-83b6-4952-8e8e-cd93809a87b5-kube-api-access-crzlf\") pod \"downloads-7954f5f757-cbmtf\" (UID: \"0a13bc75-83b6-4952-8e8e-cd93809a87b5\") " pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245891 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-etcd-client\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245919 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f0a310-1676-49a4-826a-d83406d28e93-serving-cert\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245944 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3acdc6-8778-4094-af1f-8f3824029d90-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.245968 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzxr\" (UniqueName: \"kubernetes.io/projected/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-kube-api-access-hqzxr\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246016 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v545x\" (UniqueName: \"kubernetes.io/projected/f61cc107-39c3-4add-b9a1-45c5d744ea4b-kube-api-access-v545x\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " 
pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246044 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-images\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246065 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3acdc6-8778-4094-af1f-8f3824029d90-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246085 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246143 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246193 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tn895\" (UniqueName: \"kubernetes.io/projected/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-kube-api-access-tn895\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246229 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-trusted-ca\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4br\" (UniqueName: \"kubernetes.io/projected/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-kube-api-access-gw4br\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246265 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f264ef7-44e5-4dee-91c5-89c87a000c9f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: \"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246287 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66m4j\" (UniqueName: \"kubernetes.io/projected/90903b96-fde6-4803-93ab-15ecd16645af-kube-api-access-66m4j\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246309 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9181cd89-dbc9-4fe2-aacb-4f67003e5738-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246557 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/72f0a310-1676-49a4-826a-d83406d28e93-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.246573 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-config\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.247513 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.247632 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4a1f7edf-3d9d-468d-a18c-a08128eca13c-config\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.247799 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.248110 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61cc107-39c3-4add-b9a1-45c5d744ea4b-trusted-ca\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.249258 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.249310 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-metrics-tls\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.249863 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f61cc107-39c3-4add-b9a1-45c5d744ea4b-serving-cert\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.250164 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f0a310-1676-49a4-826a-d83406d28e93-serving-cert\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.283157 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.291298 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1f7edf-3d9d-468d-a18c-a08128eca13c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.316416 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.323726 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.344245 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.364440 4826 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.382879 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.404081 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.423201 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.443500 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.464574 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.469170 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-etcd-client\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.482963 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.502687 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.508863 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90903b96-fde6-4803-93ab-15ecd16645af-serving-cert\") pod 
\"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.523245 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.527056 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-config\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.544539 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.546454 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.563198 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.566561 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/90903b96-fde6-4803-93ab-15ecd16645af-etcd-service-ca\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.583094 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.603737 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.624289 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.644471 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.651925 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3acdc6-8778-4094-af1f-8f3824029d90-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.664126 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.683646 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.688766 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3acdc6-8778-4094-af1f-8f3824029d90-config\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 
18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.703504 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.710289 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9181cd89-dbc9-4fe2-aacb-4f67003e5738-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.723383 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.742788 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.747492 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9181cd89-dbc9-4fe2-aacb-4f67003e5738-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.763302 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.771278 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f264ef7-44e5-4dee-91c5-89c87a000c9f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: 
\"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.783915 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.802979 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.807746 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1392e2fd-a142-4584-8df9-7470b9441a3d-images\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.824827 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.844707 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.850716 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1392e2fd-a142-4584-8df9-7470b9441a3d-proxy-tls\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.864229 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.883725 4826 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.902702 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.924278 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.944164 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.963989 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.975691 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.975720 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.975689 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.975953 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:08 crc kubenswrapper[4826]: I0319 18:58:08.984223 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.003294 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.023248 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.044133 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.063218 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.070713 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-srv-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.081637 4826 request.go:700] Waited for 1.013948872s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.083597 4826 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.091834 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-profile-collector-cert\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.104295 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.124348 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.144126 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.163383 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.192925 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.203985 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.223914 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.233235 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.246872 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.263101 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.303567 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.337020 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.344103 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.363869 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.384533 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.403646 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.423591 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.443858 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.464304 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.483925 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.503424 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.524769 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.543582 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.592362 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnblb\" (UniqueName: \"kubernetes.io/projected/4aa156db-ba19-4535-ba78-b7a4b94e29e9-kube-api-access-lnblb\") pod \"cluster-samples-operator-665b6dd947-h6hdz\" (UID: \"4aa156db-ba19-4535-ba78-b7a4b94e29e9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.612066 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fk87\" (UniqueName: \"kubernetes.io/projected/c5a99915-a5d4-4e6f-87e0-ead11079eeec-kube-api-access-4fk87\") pod \"machine-approver-56656f9798-6sfrz\" (UID: \"c5a99915-a5d4-4e6f-87e0-ead11079eeec\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.633640 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvkzz\" (UniqueName: \"kubernetes.io/projected/84d8b63e-fcf9-45c2-98be-d2aa00660cee-kube-api-access-xvkzz\") pod \"machine-api-operator-5694c8668f-6tq97\" (UID: \"84d8b63e-fcf9-45c2-98be-d2aa00660cee\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.656324 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mgsm\" (UniqueName: \"kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm\") pod \"controller-manager-879f6c89f-brsbv\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.672068 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ln6\" (UniqueName: \"kubernetes.io/projected/86f5311b-39ed-455f-a9bc-b83044d63db8-kube-api-access-v5ln6\") pod \"apiserver-76f77b778f-h7mf4\" (UID: \"86f5311b-39ed-455f-a9bc-b83044d63db8\") " pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.693940 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7h6q\" (UniqueName: \"kubernetes.io/projected/4e673de9-6eb1-430b-8123-1254957f125f-kube-api-access-f7h6q\") pod \"apiserver-7bbb656c7d-tw9k9\" (UID: \"4e673de9-6eb1-430b-8123-1254957f125f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.709250 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jm24\" (UniqueName: \"kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24\") 
pod \"route-controller-manager-6576b87f9c-hcg9z\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.726835 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.730897 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkr6v\" (UniqueName: \"kubernetes.io/projected/352eae31-d0e1-452b-8319-ab53b8095b5a-kube-api-access-kkr6v\") pod \"authentication-operator-69f744f599-zl2jh\" (UID: \"352eae31-d0e1-452b-8319-ab53b8095b5a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.739695 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.744269 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.749737 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.756076 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7wwk\" (UniqueName: \"kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk\") pod \"oauth-openshift-558db77b4-4t5nj\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.764337 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.785051 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.798220 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.804444 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.814239 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.824877 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.833185 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.844904 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.850036 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 18:58:09 crc kubenswrapper[4826]: W0319 18:58:09.857679 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a99915_a5d4_4e6f_87e0_ead11079eeec.slice/crio-a24dbd95d0818610fb03e37c4f86b7d947f353ba800326bc84973fdea2163e69 WatchSource:0}: Error finding container a24dbd95d0818610fb03e37c4f86b7d947f353ba800326bc84973fdea2163e69: Status 404 returned error can't find the container with id a24dbd95d0818610fb03e37c4f86b7d947f353ba800326bc84973fdea2163e69 Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.868267 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.871254 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.887843 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.888743 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.906275 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.923932 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.944422 4826 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.968638 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 18:58:09 crc kubenswrapper[4826]: I0319 18:58:09.986180 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.005597 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.029304 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.044533 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.059379 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h7mf4"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.064101 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 
18:58:10.069174 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.082914 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.090128 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9978e8_9235_4a27_b28e_6e8ed8cf70c4.slice/crio-f1c8abc571381b171bdaca4e6f9ac25e41b9bc990451c75631d8ca91baf9ec0a WatchSource:0}: Error finding container f1c8abc571381b171bdaca4e6f9ac25e41b9bc990451c75631d8ca91baf9ec0a: Status 404 returned error can't find the container with id f1c8abc571381b171bdaca4e6f9ac25e41b9bc990451c75631d8ca91baf9ec0a Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.101390 4826 request.go:700] Waited for 1.916460172s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.102647 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.126020 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.144742 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.152645 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6tq97"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.164114 4826 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.175878 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zl2jh"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.177208 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.184015 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.189138 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d8b63e_fcf9_45c2_98be_d2aa00660cee.slice/crio-d92052b4bbcc5dd470bc1a38604de8319c4688c8274228810d5c3dd3d374c82c WatchSource:0}: Error finding container d92052b4bbcc5dd470bc1a38604de8319c4688c8274228810d5c3dd3d374c82c: Status 404 returned error can't find the container with id d92052b4bbcc5dd470bc1a38604de8319c4688c8274228810d5c3dd3d374c82c Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.206987 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.224270 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.244942 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z66bw\" (UniqueName: \"kubernetes.io/projected/a7b17be3-fa05-45b6-ba33-b514326061b1-kube-api-access-z66bw\") pod \"migrator-59844c95c7-jfxcc\" (UID: \"a7b17be3-fa05-45b6-ba33-b514326061b1\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.260379 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9181cd89-dbc9-4fe2-aacb-4f67003e5738-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kzx65\" (UID: \"9181cd89-dbc9-4fe2-aacb-4f67003e5738\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.282252 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbtp\" (UniqueName: \"kubernetes.io/projected/ed48d331-c0eb-42d6-8d6e-6617fc4b7985-kube-api-access-dkbtp\") pod \"openshift-controller-manager-operator-756b6f6bc6-jgpxv\" (UID: \"ed48d331-c0eb-42d6-8d6e-6617fc4b7985\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.296210 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.302140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a1f7edf-3d9d-468d-a18c-a08128eca13c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-98vw6\" (UID: \"4a1f7edf-3d9d-468d-a18c-a08128eca13c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.307014 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.307275 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.315302 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.324731 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzlf\" (UniqueName: \"kubernetes.io/projected/0a13bc75-83b6-4952-8e8e-cd93809a87b5-kube-api-access-crzlf\") pod \"downloads-7954f5f757-cbmtf\" (UID: \"0a13bc75-83b6-4952-8e8e-cd93809a87b5\") " pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.338411 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzxr\" (UniqueName: \"kubernetes.io/projected/9bc83b3f-72da-4527-b7a8-5f09d3f5f39f-kube-api-access-hqzxr\") pod \"catalog-operator-68c6474976-v6d7k\" (UID: \"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.357183 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.357568 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v545x\" (UniqueName: \"kubernetes.io/projected/f61cc107-39c3-4add-b9a1-45c5d744ea4b-kube-api-access-v545x\") pod \"console-operator-58897d9998-zc8ht\" (UID: \"f61cc107-39c3-4add-b9a1-45c5d744ea4b\") " pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.382038 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3acdc6-8778-4094-af1f-8f3824029d90-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vj6zq\" (UID: \"0d3acdc6-8778-4094-af1f-8f3824029d90\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.394092 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.399893 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blslc\" (UniqueName: \"kubernetes.io/projected/6f264ef7-44e5-4dee-91c5-89c87a000c9f-kube-api-access-blslc\") pod \"multus-admission-controller-857f4d67dd-6vlbh\" (UID: \"6f264ef7-44e5-4dee-91c5-89c87a000c9f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.423945 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.425484 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/72f0a310-1676-49a4-826a-d83406d28e93-kube-api-access-659lx\") pod \"openshift-config-operator-7777fb866f-pfrcn\" (UID: \"72f0a310-1676-49a4-826a-d83406d28e93\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.442413 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4br\" (UniqueName: \"kubernetes.io/projected/d3d2b5c3-e37e-4a58-af35-8980d9c8d43a-kube-api-access-gw4br\") pod \"control-plane-machine-set-operator-78cbb6b69f-nw52z\" (UID: \"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.462410 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66m4j\" (UniqueName: \"kubernetes.io/projected/90903b96-fde6-4803-93ab-15ecd16645af-kube-api-access-66m4j\") pod \"etcd-operator-b45778765-tmbjc\" (UID: \"90903b96-fde6-4803-93ab-15ecd16645af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.479500 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn895\" (UniqueName: \"kubernetes.io/projected/cf827e9c-2ada-440e-a3e0-99deb1eb54c1-kube-api-access-tn895\") pod \"dns-operator-744455d44c-jmhs5\" (UID: \"cf827e9c-2ada-440e-a3e0-99deb1eb54c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.480090 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.516627 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9bh\" (UniqueName: \"kubernetes.io/projected/1392e2fd-a142-4584-8df9-7470b9441a3d-kube-api-access-pn9bh\") pod \"machine-config-operator-74547568cd-9bmdx\" (UID: \"1392e2fd-a142-4584-8df9-7470b9441a3d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.527849 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.530978 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.546056 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.560964 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.566253 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.578703 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.587607 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.592130 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded48d331_c0eb_42d6_8d6e_6617fc4b7985.slice/crio-4008f847d8393cf0289eb1d2c1fba8b0c21d318bcbc7cb885aa7c587a7cb2c4b WatchSource:0}: Error finding container 4008f847d8393cf0289eb1d2c1fba8b0c21d318bcbc7cb885aa7c587a7cb2c4b: Status 404 returned error can't find the container with id 4008f847d8393cf0289eb1d2c1fba8b0c21d318bcbc7cb885aa7c587a7cb2c4b Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.600335 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.606682 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.620326 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.626029 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.643842 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.650109 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.654289 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" event={"ID":"84d8b63e-fcf9-45c2-98be-d2aa00660cee","Type":"ContainerStarted","Data":"246d797bed7ca6c3883d36039adbfddb48c0ae28959c23f01bf918a67b82d516"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.654352 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" event={"ID":"84d8b63e-fcf9-45c2-98be-d2aa00660cee","Type":"ContainerStarted","Data":"008788c51b1e43456f187b47f4d2d2cf9377dc90b504b1cd7ed07fca66049fc9"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.654368 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" event={"ID":"84d8b63e-fcf9-45c2-98be-d2aa00660cee","Type":"ContainerStarted","Data":"d92052b4bbcc5dd470bc1a38604de8319c4688c8274228810d5c3dd3d374c82c"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.656241 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" event={"ID":"4e673de9-6eb1-430b-8123-1254957f125f","Type":"ContainerStarted","Data":"104571ef23302cc6ca4eb13c65faddf34942c80de2bdccd33624aa91d4bcbad7"} Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.661855 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1f7edf_3d9d_468d_a18c_a08128eca13c.slice/crio-3b17e56e3c7d370f9e7209ebab234d88a510d6c8b357067b630281ca64b41bf0 WatchSource:0}: Error finding container 3b17e56e3c7d370f9e7209ebab234d88a510d6c8b357067b630281ca64b41bf0: Status 404 returned error can't find the container with id 3b17e56e3c7d370f9e7209ebab234d88a510d6c8b357067b630281ca64b41bf0 
Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.662198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" event={"ID":"8db9fbca-87a7-4706-9aef-e78fa4fefe16","Type":"ContainerStarted","Data":"a677e0802b103dc9d367ba48513b25e441755829fbbaadda602f9b0a35a2ad74"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.662237 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" event={"ID":"8db9fbca-87a7-4706-9aef-e78fa4fefe16","Type":"ContainerStarted","Data":"90877a5449a9002d86ccb0108f6df38bb0e30ed896b06bd5477fe4b621b82bbc"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.662452 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.664108 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.665980 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" event={"ID":"7b5ed6bf-c032-4782-86eb-4803da62cb59","Type":"ContainerStarted","Data":"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.666114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" event={"ID":"7b5ed6bf-c032-4782-86eb-4803da62cb59","Type":"ContainerStarted","Data":"2658ee96c6ef5f5d81200c5cd8ac3ab3646124091534ae879d3aa094b5ef4801"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.666915 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 
18:58:10.670849 4826 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-brsbv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.670905 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.671184 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" event={"ID":"352eae31-d0e1-452b-8319-ab53b8095b5a","Type":"ContainerStarted","Data":"d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.671218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" event={"ID":"352eae31-d0e1-452b-8319-ab53b8095b5a","Type":"ContainerStarted","Data":"7e0d634bbbead14e667deb02f8268f65dcbfc61045531aef3016a3c370f3e1dc"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.671580 4826 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4t5nj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.671612 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.678333 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.684598 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.684690 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.684886 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-metrics-certs\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685221 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjjm\" (UniqueName: \"kubernetes.io/projected/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-kube-api-access-9hjjm\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: 
\"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685274 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2760a70-3c84-42db-824f-1ed69c419347-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685296 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-service-ca-bundle\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685374 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685542 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685581 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.685750 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686017 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686039 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-stats-auth\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686058 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr286\" (UniqueName: \"kubernetes.io/projected/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-kube-api-access-qr286\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " 
pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686086 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686104 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686480 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5ae7566-0aea-4736-8a36-3f4664ab9768-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: E0319 18:58:10.686497 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.186478222 +0000 UTC m=+115.940546535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686563 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vblpc\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686605 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686703 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5ae7566-0aea-4736-8a36-3f4664ab9768-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686734 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18582bb6-239d-4f2e-91a4-2ea59fb54d71-trusted-ca\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686752 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686774 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686818 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686893 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4858c7f7-6a71-40dc-8222-082f6d97504c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686927 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2760a70-3c84-42db-824f-1ed69c419347-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.686996 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j99t\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-kube-api-access-2j99t\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687128 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zqc\" (UniqueName: \"kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687154 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle\") 
pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687195 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687255 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18582bb6-239d-4f2e-91a4-2ea59fb54d71-metrics-tls\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687274 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc62w\" (UniqueName: \"kubernetes.io/projected/4858c7f7-6a71-40dc-8222-082f6d97504c-kube-api-access-dc62w\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687498 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmzr\" (UniqueName: \"kubernetes.io/projected/a2760a70-3c84-42db-824f-1ed69c419347-kube-api-access-mlmzr\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 
18:58:10.687564 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687602 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-default-certificate\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.687661 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c4k\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-kube-api-access-g2c4k\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.691835 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.716843 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" event={"ID":"c5a99915-a5d4-4e6f-87e0-ead11079eeec","Type":"ContainerStarted","Data":"53376fde2762d3aa26db4166115f205b2a474308845713685329b2f817389ac0"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.716894 4826 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" event={"ID":"c5a99915-a5d4-4e6f-87e0-ead11079eeec","Type":"ContainerStarted","Data":"a24dbd95d0818610fb03e37c4f86b7d947f353ba800326bc84973fdea2163e69"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.717720 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" event={"ID":"ed48d331-c0eb-42d6-8d6e-6617fc4b7985","Type":"ContainerStarted","Data":"4008f847d8393cf0289eb1d2c1fba8b0c21d318bcbc7cb885aa7c587a7cb2c4b"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.723015 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.728194 4826 generic.go:334] "Generic (PLEG): container finished" podID="86f5311b-39ed-455f-a9bc-b83044d63db8" containerID="efd9053ae76eb6cf747cec32478dc582ecc6cd925512ccb2c6d45ebab9385cf7" exitCode=0 Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.728593 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" event={"ID":"86f5311b-39ed-455f-a9bc-b83044d63db8","Type":"ContainerDied","Data":"efd9053ae76eb6cf747cec32478dc582ecc6cd925512ccb2c6d45ebab9385cf7"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.728634 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" event={"ID":"86f5311b-39ed-455f-a9bc-b83044d63db8","Type":"ContainerStarted","Data":"61291cfdba2222d7aabbadd6aa80af544c56d3a1a3b7074a909b59b1ac2187cd"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.732045 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" 
event={"ID":"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4","Type":"ContainerStarted","Data":"94a5de13a40094bcfc5b95efc4f3f8adfa66b38cd34fcadc97d5a1981791a664"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.732084 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" event={"ID":"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4","Type":"ContainerStarted","Data":"f1c8abc571381b171bdaca4e6f9ac25e41b9bc990451c75631d8ca91baf9ec0a"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.733113 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.737738 4826 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hcg9z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.737781 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.738218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" event={"ID":"4aa156db-ba19-4535-ba78-b7a4b94e29e9","Type":"ContainerStarted","Data":"19ca1236e4c6ff05800c7bb270356105589bb14f6cc362aaf7ba154ecc9b7321"} Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.738268 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" event={"ID":"4aa156db-ba19-4535-ba78-b7a4b94e29e9","Type":"ContainerStarted","Data":"5c2ed2c59a376309e44ce019a82619e522eb8d601a5a15c59aed95e38dd32d12"} Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.738405 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9181cd89_dbc9_4fe2_aacb_4f67003e5738.slice/crio-7fb07bd0281714154508f14a301f27f5415497b8e1aef38271e61887573dfa6d WatchSource:0}: Error finding container 7fb07bd0281714154508f14a301f27f5415497b8e1aef38271e61887573dfa6d: Status 404 returned error can't find the container with id 7fb07bd0281714154508f14a301f27f5415497b8e1aef38271e61887573dfa6d Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.739636 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.762043 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zc8ht"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.790509 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:10 crc kubenswrapper[4826]: E0319 18:58:10.790804 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.290762947 +0000 UTC m=+116.044831270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791106 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zqc\" (UniqueName: \"kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791149 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791180 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791205 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-srv-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: 
\"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791255 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18582bb6-239d-4f2e-91a4-2ea59fb54d71-metrics-tls\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791273 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc62w\" (UniqueName: \"kubernetes.io/projected/4858c7f7-6a71-40dc-8222-082f6d97504c-kube-api-access-dc62w\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791484 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-registration-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791500 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-certs\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791527 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmzr\" (UniqueName: 
\"kubernetes.io/projected/a2760a70-3c84-42db-824f-1ed69c419347-kube-api-access-mlmzr\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791545 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d732543-0ecb-4570-bbfb-8a80570674d5-proxy-tls\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791595 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791616 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-default-certificate\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791633 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-csi-data-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791674 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c4k\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-kube-api-access-g2c4k\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791694 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc8w2\" (UniqueName: \"kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791735 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791767 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791844 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" 
(UniqueName: \"kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.791996 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssjm\" (UniqueName: \"kubernetes.io/projected/a4a3e741-fc60-4076-8167-0e7cc776345e-kube-api-access-xssjm\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792051 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-cabundle\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792116 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-metrics-certs\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792132 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4a3e741-fc60-4076-8167-0e7cc776345e-config-volume\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792149 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792194 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-plugins-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792212 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-config\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792230 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-apiservice-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: 
I0319 18:58:10.792247 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-node-bootstrap-token\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792283 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjjm\" (UniqueName: \"kubernetes.io/projected/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-kube-api-access-9hjjm\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.792301 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mbz\" (UniqueName: \"kubernetes.io/projected/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-kube-api-access-p6mbz\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.804905 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/781f0741-f222-4ccc-aa80-6dde59e9648d-tmpfs\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805097 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2760a70-3c84-42db-824f-1ed69c419347-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805134 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-service-ca-bundle\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805171 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4brm\" (UniqueName: \"kubernetes.io/projected/6d732543-0ecb-4570-bbfb-8a80570674d5-kube-api-access-t4brm\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805210 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkpq\" (UniqueName: \"kubernetes.io/projected/4a42d5d3-6341-4be7-a1fc-69f49476197f-kube-api-access-flkpq\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805294 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.805828 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.806219 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.806258 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d732543-0ecb-4570-bbfb-8a80570674d5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.806312 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-socket-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.806786 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc 
kubenswrapper[4826]: I0319 18:58:10.807032 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfdt\" (UniqueName: \"kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.807089 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvltp\" (UniqueName: \"kubernetes.io/projected/fdb49b25-5e81-4f9d-9a17-34bade2cec18-kube-api-access-gvltp\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.807930 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.817333 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.819376 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-service-ca-bundle\") pod \"router-default-5444994796-drbf6\" (UID: 
\"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: W0319 18:58:10.826547 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61cc107_39c3_4add_b9a1_45c5d744ea4b.slice/crio-c171b331b82387ac72a3bc70808a3ed64199b133828e5ffa3a07bfb7d6890fa0 WatchSource:0}: Error finding container c171b331b82387ac72a3bc70808a3ed64199b133828e5ffa3a07bfb7d6890fa0: Status 404 returned error can't find the container with id c171b331b82387ac72a3bc70808a3ed64199b133828e5ffa3a07bfb7d6890fa0 Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.832805 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2760a70-3c84-42db-824f-1ed69c419347-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.839343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.840069 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-default-certificate\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.843809 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.863925 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-mountpoint-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.867088 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsr27\" (UniqueName: \"kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.867151 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42d5d3-6341-4be7-a1fc-69f49476197f-cert\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.867220 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc 
kubenswrapper[4826]: I0319 18:58:10.867294 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-stats-auth\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: E0319 18:58:10.867535 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.367521444 +0000 UTC m=+116.121589757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.868733 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-metrics-certs\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.869844 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zqc\" (UniqueName: \"kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc 
kubenswrapper[4826]: I0319 18:58:10.869943 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.870085 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr286\" (UniqueName: \"kubernetes.io/projected/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-kube-api-access-qr286\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.870149 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.870396 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871172 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " 
pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871436 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871625 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871716 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871742 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.871763 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdq6z\" (UniqueName: \"kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z\") 
pod \"auto-csr-approver-29565778-hxldh\" (UID: \"509cd3a8-f3bb-4214-a70b-e589905ad242\") " pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.872531 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5ae7566-0aea-4736-8a36-3f4664ab9768-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.872729 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vblpc\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.872790 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874006 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874144 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-stats-auth\") pod 
\"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874236 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874305 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874583 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5ae7566-0aea-4736-8a36-3f4664ab9768-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874801 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.874952 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-serving-cert\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875003 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4nn\" (UniqueName: \"kubernetes.io/projected/e4f6594b-0c99-449b-ac20-4aae09000a73-kube-api-access-sm4nn\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875298 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18582bb6-239d-4f2e-91a4-2ea59fb54d71-trusted-ca\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875323 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875441 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: 
\"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875716 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875783 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-webhook-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875803 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqkc\" (UniqueName: \"kubernetes.io/projected/781f0741-f222-4ccc-aa80-6dde59e9648d-kube-api-access-nzqkc\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875825 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875848 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m6bpb\" (UniqueName: \"kubernetes.io/projected/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-kube-api-access-m6bpb\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.875872 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-key\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.876022 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.876060 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4a3e741-fc60-4076-8167-0e7cc776345e-metrics-tls\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.876672 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc 
kubenswrapper[4826]: I0319 18:58:10.877852 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18582bb6-239d-4f2e-91a4-2ea59fb54d71-trusted-ca\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.878891 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5ae7566-0aea-4736-8a36-3f4664ab9768-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.879554 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert\") pod \"console-f9d7485db-p892r\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.879629 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4858c7f7-6a71-40dc-8222-082f6d97504c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.879790 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.880038 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5vs\" (UniqueName: \"kubernetes.io/projected/f2201d52-2767-4119-9f15-8e3da0ee8570-kube-api-access-th5vs\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.880202 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2760a70-3c84-42db-824f-1ed69c419347-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.880824 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j99t\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-kube-api-access-2j99t\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.880958 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2760a70-3c84-42db-824f-1ed69c419347-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.883261 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5ae7566-0aea-4736-8a36-3f4664ab9768-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.883854 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4858c7f7-6a71-40dc-8222-082f6d97504c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.884274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmzr\" (UniqueName: \"kubernetes.io/projected/a2760a70-3c84-42db-824f-1ed69c419347-kube-api-access-mlmzr\") pod \"openshift-apiserver-operator-796bbdcf4f-vxvlt\" (UID: \"a2760a70-3c84-42db-824f-1ed69c419347\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.884566 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18582bb6-239d-4f2e-91a4-2ea59fb54d71-metrics-tls\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.884882 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.892629 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc62w\" (UniqueName: \"kubernetes.io/projected/4858c7f7-6a71-40dc-8222-082f6d97504c-kube-api-access-dc62w\") pod \"package-server-manager-789f6589d5-nlft6\" (UID: \"4858c7f7-6a71-40dc-8222-082f6d97504c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.893187 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjjm\" (UniqueName: \"kubernetes.io/projected/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-kube-api-access-9hjjm\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.894301 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3abdacf1-5969-4ef1-a1f6-745a3750dfaa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lfzff\" (UID: \"3abdacf1-5969-4ef1-a1f6-745a3750dfaa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.905176 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c4k\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-kube-api-access-g2c4k\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.919906 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.932924 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.948380 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr286\" (UniqueName: \"kubernetes.io/projected/ee11e1f6-25be-40f4-b19b-a2d8e439d8c6-kube-api-access-qr286\") pod \"router-default-5444994796-drbf6\" (UID: \"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6\") " pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.952881 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn"] Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.972115 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5ae7566-0aea-4736-8a36-3f4664ab9768-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8ghl4\" (UID: \"a5ae7566-0aea-4736-8a36-3f4664ab9768\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:10 crc kubenswrapper[4826]: E0319 18:58:10.984404 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.484370982 +0000 UTC m=+116.238439295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.984935 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985416 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4a3e741-fc60-4076-8167-0e7cc776345e-config-volume\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985452 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-plugins-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985468 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-config\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985490 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-apiservice-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985508 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-node-bootstrap-token\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985548 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mbz\" (UniqueName: \"kubernetes.io/projected/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-kube-api-access-p6mbz\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985565 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/781f0741-f222-4ccc-aa80-6dde59e9648d-tmpfs\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985582 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4brm\" (UniqueName: \"kubernetes.io/projected/6d732543-0ecb-4570-bbfb-8a80570674d5-kube-api-access-t4brm\") pod 
\"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985603 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flkpq\" (UniqueName: \"kubernetes.io/projected/4a42d5d3-6341-4be7-a1fc-69f49476197f-kube-api-access-flkpq\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985642 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d732543-0ecb-4570-bbfb-8a80570674d5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985678 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-socket-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985704 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfdt\" (UniqueName: \"kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985725 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gvltp\" (UniqueName: \"kubernetes.io/projected/fdb49b25-5e81-4f9d-9a17-34bade2cec18-kube-api-access-gvltp\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985744 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-mountpoint-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985763 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsr27\" (UniqueName: \"kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985780 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42d5d3-6341-4be7-a1fc-69f49476197f-cert\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985806 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985837 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985857 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdq6z\" (UniqueName: \"kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z\") pod \"auto-csr-approver-29565778-hxldh\" (UID: \"509cd3a8-f3bb-4214-a70b-e589905ad242\") " pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985899 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985923 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-serving-cert\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985941 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985961 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4nn\" (UniqueName: \"kubernetes.io/projected/e4f6594b-0c99-449b-ac20-4aae09000a73-kube-api-access-sm4nn\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985980 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.985999 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6bpb\" (UniqueName: \"kubernetes.io/projected/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-kube-api-access-m6bpb\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986022 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-webhook-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986041 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqkc\" (UniqueName: 
\"kubernetes.io/projected/781f0741-f222-4ccc-aa80-6dde59e9648d-kube-api-access-nzqkc\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986059 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-key\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986085 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986104 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4a3e741-fc60-4076-8167-0e7cc776345e-metrics-tls\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986129 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th5vs\" (UniqueName: \"kubernetes.io/projected/f2201d52-2767-4119-9f15-8e3da0ee8570-kube-api-access-th5vs\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986181 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-srv-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986205 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-registration-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986221 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-certs\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986241 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d732543-0ecb-4570-bbfb-8a80570674d5-proxy-tls\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986261 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-csi-data-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kc8w2\" (UniqueName: \"kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986323 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986345 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986366 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssjm\" (UniqueName: \"kubernetes.io/projected/a4a3e741-fc60-4076-8167-0e7cc776345e-kube-api-access-xssjm\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.986420 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-cabundle\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.990843 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vblpc\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.992544 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-cabundle\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.992808 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d732543-0ecb-4570-bbfb-8a80570674d5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.993341 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4a3e741-fc60-4076-8167-0e7cc776345e-config-volume\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 
18:58:10.993548 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-plugins-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.993998 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-config\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.994119 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.994229 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-socket-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.994447 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-mountpoint-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.996557 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-serving-cert\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.996635 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.997062 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-registration-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:10 crc kubenswrapper[4826]: I0319 18:58:10.998035 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/781f0741-f222-4ccc-aa80-6dde59e9648d-tmpfs\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:10.988014 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.004068 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.004281 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.504264333 +0000 UTC m=+116.258332646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.006392 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-node-bootstrap-token\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.008475 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-apiservice-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.008631 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-csi-data-dir\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.011678 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.012317 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.013118 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.013420 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.013562 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4a3e741-fc60-4076-8167-0e7cc776345e-metrics-tls\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.013590 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d732543-0ecb-4570-bbfb-8a80570674d5-proxy-tls\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.014536 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.015211 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/781f0741-f222-4ccc-aa80-6dde59e9648d-webhook-cert\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.016498 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/fdb49b25-5e81-4f9d-9a17-34bade2cec18-srv-cert\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.016553 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f6594b-0c99-449b-ac20-4aae09000a73-certs\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.017221 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a42d5d3-6341-4be7-a1fc-69f49476197f-cert\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.017372 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2201d52-2767-4119-9f15-8e3da0ee8570-signing-key\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.019447 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.046206 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.046843 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j99t\" (UniqueName: \"kubernetes.io/projected/18582bb6-239d-4f2e-91a4-2ea59fb54d71-kube-api-access-2j99t\") pod \"ingress-operator-5b745b69d9-r4w59\" (UID: \"18582bb6-239d-4f2e-91a4-2ea59fb54d71\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.058680 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfdt\" (UniqueName: \"kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt\") pod \"cni-sysctl-allowlist-ds-m7zht\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.084131 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvltp\" (UniqueName: \"kubernetes.io/projected/fdb49b25-5e81-4f9d-9a17-34bade2cec18-kube-api-access-gvltp\") pod \"olm-operator-6b444d44fb-dnc22\" (UID: \"fdb49b25-5e81-4f9d-9a17-34bade2cec18\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.092516 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.093041 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.593021376 +0000 UTC m=+116.347089689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.101380 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsr27\" (UniqueName: \"kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27\") pod \"marketplace-operator-79b997595-flj2h\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.112070 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cbmtf"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.125249 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6vlbh"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.128965 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4nn\" (UniqueName: \"kubernetes.io/projected/e4f6594b-0c99-449b-ac20-4aae09000a73-kube-api-access-sm4nn\") pod \"machine-config-server-fc4kh\" (UID: \"e4f6594b-0c99-449b-ac20-4aae09000a73\") " pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.157328 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6bpb\" (UniqueName: 
\"kubernetes.io/projected/7b3f973e-b13a-45f3-a9cc-b84a8a8310a1-kube-api-access-m6bpb\") pod \"service-ca-operator-777779d784-z8vj2\" (UID: \"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.163581 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.169841 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fc4kh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.180067 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4brm\" (UniqueName: \"kubernetes.io/projected/6d732543-0ecb-4570-bbfb-8a80570674d5-kube-api-access-t4brm\") pod \"machine-config-controller-84d6567774-76ppq\" (UID: \"6d732543-0ecb-4570-bbfb-8a80570674d5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.193928 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.194542 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.694520006 +0000 UTC m=+116.448588319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.198100 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkpq\" (UniqueName: \"kubernetes.io/projected/4a42d5d3-6341-4be7-a1fc-69f49476197f-kube-api-access-flkpq\") pod \"ingress-canary-cx4gd\" (UID: \"4a42d5d3-6341-4be7-a1fc-69f49476197f\") " pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.201804 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5vs\" (UniqueName: \"kubernetes.io/projected/f2201d52-2767-4119-9f15-8e3da0ee8570-kube-api-access-th5vs\") pod \"service-ca-9c57cc56f-7jk4n\" (UID: \"f2201d52-2767-4119-9f15-8e3da0ee8570\") " pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:11 crc kubenswrapper[4826]: W0319 18:58:11.202920 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f264ef7_44e5_4dee_91c5_89c87a000c9f.slice/crio-faf63bc91603aa92212bc66b59ba6a9b55e07db9a385e9ada06343dcff078525 WatchSource:0}: Error finding container faf63bc91603aa92212bc66b59ba6a9b55e07db9a385e9ada06343dcff078525: Status 404 returned error can't find the container with id faf63bc91603aa92212bc66b59ba6a9b55e07db9a385e9ada06343dcff078525 Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.222162 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdq6z\" (UniqueName: 
\"kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z\") pod \"auto-csr-approver-29565778-hxldh\" (UID: \"509cd3a8-f3bb-4214-a70b-e589905ad242\") " pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.222435 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.237513 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.245754 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqkc\" (UniqueName: \"kubernetes.io/projected/781f0741-f222-4ccc-aa80-6dde59e9648d-kube-api-access-nzqkc\") pod \"packageserver-d55dfcdfc-fcnzx\" (UID: \"781f0741-f222-4ccc-aa80-6dde59e9648d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.266045 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mbz\" (UniqueName: \"kubernetes.io/projected/eeb43c2f-961b-4ed4-9aa0-cda4dea289cb-kube-api-access-p6mbz\") pod \"csi-hostpathplugin-4rf57\" (UID: \"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb\") " pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.268296 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jmhs5"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.284933 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc8w2\" (UniqueName: \"kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2\") pod \"collect-profiles-29565765-b929z\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.295992 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.296178 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.796148179 +0000 UTC m=+116.550216492 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.296569 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.296611 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.296966 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.796958202 +0000 UTC m=+116.551026515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.305846 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssjm\" (UniqueName: \"kubernetes.io/projected/a4a3e741-fc60-4076-8167-0e7cc776345e-kube-api-access-xssjm\") pod \"dns-default-ph9t5\" (UID: \"a4a3e741-fc60-4076-8167-0e7cc776345e\") " pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.306028 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f822d71-562c-4d2c-917f-82281bef6c8a-metrics-certs\") pod \"network-metrics-daemon-7fdpm\" (UID: \"9f822d71-562c-4d2c-917f-82281bef6c8a\") " pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.334215 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.352564 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.357114 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.361232 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.373201 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.374086 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.378850 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.392588 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.396201 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.398677 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmbjc"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.403335 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.404150 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.904100655 +0000 UTC m=+116.658168968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.404399 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fdpm" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.404481 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.406702 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:11.906693585 +0000 UTC m=+116.660761898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.417949 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.425511 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.432116 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.450771 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cx4gd" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.459338 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.460641 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.476776 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.505449 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.505832 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.005814561 +0000 UTC m=+116.759882874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.506119 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-p892r"] Mar 19 18:58:11 crc kubenswrapper[4826]: W0319 18:58:11.571916 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8ad588_15a8_47f2_97d5_950d4a757183.slice/crio-4514fae9eb9413de939022a0a83779ffc4e12982e4d63d5bd09182c188e4d252 WatchSource:0}: Error finding container 4514fae9eb9413de939022a0a83779ffc4e12982e4d63d5bd09182c188e4d252: Status 404 returned error can't find the container with id 4514fae9eb9413de939022a0a83779ffc4e12982e4d63d5bd09182c188e4d252 Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.580890 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.612875 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.613199 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.113187701 +0000 UTC m=+116.867256014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.616151 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.714449 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.714769 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.214725151 +0000 UTC m=+116.968793464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.715096 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.715520 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.215505583 +0000 UTC m=+116.969573896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: W0319 18:58:11.748434 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2760a70_3c84_42db_824f_1ed69c419347.slice/crio-23a87dcafe72c88cf95bceba056a72c51c371ea35daac22590b0d1104fbeb3ec WatchSource:0}: Error finding container 23a87dcafe72c88cf95bceba056a72c51c371ea35daac22590b0d1104fbeb3ec: Status 404 returned error can't find the container with id 23a87dcafe72c88cf95bceba056a72c51c371ea35daac22590b0d1104fbeb3ec Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.760751 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" event={"ID":"4858c7f7-6a71-40dc-8222-082f6d97504c","Type":"ContainerStarted","Data":"1ab50348737efedbc994bf4932f0d34c36b3e7eb06044c631c33785338595fd0"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.770473 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" event={"ID":"c5a99915-a5d4-4e6f-87e0-ead11079eeec","Type":"ContainerStarted","Data":"09e1cc4460238ec6c3b87ae4711b926916978c3a89ef7df0fb6bcea8499b2275"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.776626 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" 
event={"ID":"4a1f7edf-3d9d-468d-a18c-a08128eca13c","Type":"ContainerStarted","Data":"6ef9661aee7852ebc1a1c67102d993cd2f04da7ac697adc6f39ca177916fcc0e"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.776693 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" event={"ID":"4a1f7edf-3d9d-468d-a18c-a08128eca13c","Type":"ContainerStarted","Data":"3b17e56e3c7d370f9e7209ebab234d88a510d6c8b357067b630281ca64b41bf0"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.783114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" event={"ID":"3abdacf1-5969-4ef1-a1f6-745a3750dfaa","Type":"ContainerStarted","Data":"d72cc8bd6f3e8e26ad1ea4f8c9fc613f367998d1531abe834a24193afde78b76"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.794364 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" event={"ID":"a7b17be3-fa05-45b6-ba33-b514326061b1","Type":"ContainerStarted","Data":"39fae11bebde2083210732851ac8154b3980e7ed245da255d9f081df45c2ee68"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.794439 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" event={"ID":"a7b17be3-fa05-45b6-ba33-b514326061b1","Type":"ContainerStarted","Data":"489a0e991b58c5476ec6358372317d94c4163b55a1b21c03266b76e53297edad"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.797386 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" event={"ID":"0d3acdc6-8778-4094-af1f-8f3824029d90","Type":"ContainerStarted","Data":"ed936baaa1e9eafd161e98f68ff38c5ab511d6f0894c3ddb2f9a005ad1275fee"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.810677 
4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" event={"ID":"1392e2fd-a142-4584-8df9-7470b9441a3d","Type":"ContainerStarted","Data":"991d6f68b80b6f8897e451a06b96848a8260782b667933743fdb1d66a68415a6"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.811996 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" event={"ID":"90903b96-fde6-4803-93ab-15ecd16645af","Type":"ContainerStarted","Data":"67dd36d8aaec696e992d806a964bcfb73ca4868937747a2138508529e795165f"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.816225 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.816408 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.316360135 +0000 UTC m=+117.070428438 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.816563 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.818119 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.318095852 +0000 UTC m=+117.072164365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.819288 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" event={"ID":"f61cc107-39c3-4add-b9a1-45c5d744ea4b","Type":"ContainerStarted","Data":"a456ee0ad60c9f375ff1bffa7a4c02e145d0984db73abfcd1d8cb0a4007c2682"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.819324 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" event={"ID":"f61cc107-39c3-4add-b9a1-45c5d744ea4b","Type":"ContainerStarted","Data":"c171b331b82387ac72a3bc70808a3ed64199b133828e5ffa3a07bfb7d6890fa0"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.819541 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.830784 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" event={"ID":"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a","Type":"ContainerStarted","Data":"ded79eb4312c739b1cf9f964f7b563bbc6467a4fea8d3743fcefa50e46b9d391"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.841209 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" 
start-of-body= Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.841282 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.842988 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cbmtf" event={"ID":"0a13bc75-83b6-4952-8e8e-cd93809a87b5","Type":"ContainerStarted","Data":"5ecb93938d478004edef40836b5b669a963eed6dad842c63d7946224fc7b16c3"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.843023 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cbmtf" event={"ID":"0a13bc75-83b6-4952-8e8e-cd93809a87b5","Type":"ContainerStarted","Data":"2b6d6e540d000dd350d0d76c69bae0e06aee764c44fb13927e1694f5ed11c33d"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.844395 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.845460 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-hxldh"] Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.869179 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" event={"ID":"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f","Type":"ContainerStarted","Data":"e82fdfcddbdef8d498bd3bdde22d4b6daa10cd720da0d57c6f23bff2745a3227"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.869243 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" 
event={"ID":"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f","Type":"ContainerStarted","Data":"565ddb873b7702b019897ad559ec9336f22dd101fd744fd63069d321381a4613"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.870142 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.876668 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.876771 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.880241 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" event={"ID":"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a","Type":"ContainerStarted","Data":"4f03fdf980bec1a1f38e8f69a8728ccf21bbba30de33c5fbd9f148c3aa142e6e"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.889000 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p892r" event={"ID":"db8ad588-15a8-47f2-97d5-950d4a757183","Type":"ContainerStarted","Data":"4514fae9eb9413de939022a0a83779ffc4e12982e4d63d5bd09182c188e4d252"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.891299 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" 
event={"ID":"cf827e9c-2ada-440e-a3e0-99deb1eb54c1","Type":"ContainerStarted","Data":"3b34e2d7ceb04da11d75936fb0f12b90e7fbbddcfe8c7478126203fc5524b58e"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.899076 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.899120 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.902742 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" event={"ID":"ed48d331-c0eb-42d6-8d6e-6617fc4b7985","Type":"ContainerStarted","Data":"09cbcd3872f102a64fe9b6bf8b5f837ae4151465a83e5cdf562dc6a360f07252"} Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.920020 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:11 crc kubenswrapper[4826]: E0319 18:58:11.922726 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:12.422697187 +0000 UTC m=+117.176765510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:11 crc kubenswrapper[4826]: I0319 18:58:11.946646 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" event={"ID":"86f5311b-39ed-455f-a9bc-b83044d63db8","Type":"ContainerStarted","Data":"9c12fa48650e0faf68aae77fd81aae464b15adacc73a0c098be86548caa7a4f7"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.029536 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" event={"ID":"6f264ef7-44e5-4dee-91c5-89c87a000c9f","Type":"ContainerStarted","Data":"faf63bc91603aa92212bc66b59ba6a9b55e07db9a385e9ada06343dcff078525"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.029952 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:12 crc kubenswrapper[4826]: W0319 18:58:12.030138 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod509cd3a8_f3bb_4214_a70b_e589905ad242.slice/crio-f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1 
WatchSource:0}: Error finding container f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1: Status 404 returned error can't find the container with id f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1 Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.030378 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.530359574 +0000 UTC m=+117.284427887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.059128 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.104097 4826 generic.go:334] "Generic (PLEG): container finished" podID="4e673de9-6eb1-430b-8123-1254957f125f" containerID="619d60ff36707930434a3ff9492d66f3875d457d343516d2539a0d632b73602b" exitCode=0 Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.104187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" event={"ID":"4e673de9-6eb1-430b-8123-1254957f125f","Type":"ContainerDied","Data":"619d60ff36707930434a3ff9492d66f3875d457d343516d2539a0d632b73602b"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.131308 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.149330 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" event={"ID":"4aa156db-ba19-4535-ba78-b7a4b94e29e9","Type":"ContainerStarted","Data":"f24b9348b19c22015f3292f743ddcd907919eda1d64704302684c494b4e862ae"} Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.152095 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.652045783 +0000 UTC m=+117.406114086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.175213 4826 generic.go:334] "Generic (PLEG): container finished" podID="72f0a310-1676-49a4-826a-d83406d28e93" containerID="644a8a1f4fe5112f888e0472605f159749a2a26f6fe6fca7b11010a2b2f7f8fa" exitCode=0 Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.180967 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" event={"ID":"72f0a310-1676-49a4-826a-d83406d28e93","Type":"ContainerDied","Data":"644a8a1f4fe5112f888e0472605f159749a2a26f6fe6fca7b11010a2b2f7f8fa"} Mar 19 
18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.181049 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" event={"ID":"72f0a310-1676-49a4-826a-d83406d28e93","Type":"ContainerStarted","Data":"b8c55a24db5076b5f3ff1e9b97e849fd79aeae6ce116c932f553748a66c45162"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.216579 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" event={"ID":"9181cd89-dbc9-4fe2-aacb-4f67003e5738","Type":"ContainerStarted","Data":"2f497571c09aac55d2bb76489cb46c683a7849f6b072871dcf30cb4f6d3ce234"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.216629 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" event={"ID":"9181cd89-dbc9-4fe2-aacb-4f67003e5738","Type":"ContainerStarted","Data":"7fb07bd0281714154508f14a301f27f5415497b8e1aef38271e61887573dfa6d"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.219166 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fc4kh" event={"ID":"e4f6594b-0c99-449b-ac20-4aae09000a73","Type":"ContainerStarted","Data":"9f1f0956d7400d0aa5fcf74d4c6dfcc26093d03dfe76db945f5c51ecad7db482"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.221965 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drbf6" event={"ID":"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6","Type":"ContainerStarted","Data":"1b282a7c86302598b6ff45405ce57ecd13f87206a8325a6c4e74cc8fde337acb"} Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.251213 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.252505 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.265764 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.265785 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4"] Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.266141 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.766122975 +0000 UTC m=+117.520191288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.381754 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.383887 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.883855266 +0000 UTC m=+117.637923739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.412564 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53330: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.495142 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53332: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.495635 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.495925 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4rf57"] Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.495998 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:12.995984225 +0000 UTC m=+117.750052538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.597987 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.598797 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.098777721 +0000 UTC m=+117.852846034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.605431 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53336: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: W0319 18:58:12.656052 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb43c2f_961b_4ed4_9aa0_cda4dea289cb.slice/crio-bbd0d686a7cdaa8e6ea45a2660fc4c09b1f9a34aceea078edf7f6ac506405339 WatchSource:0}: Error finding container bbd0d686a7cdaa8e6ea45a2660fc4c09b1f9a34aceea078edf7f6ac506405339: Status 404 returned error can't find the container with id bbd0d686a7cdaa8e6ea45a2660fc4c09b1f9a34aceea078edf7f6ac506405339 Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.700505 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53344: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.707898 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.714353 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.214334092 +0000 UTC m=+117.968402405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.808857 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53348: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.816040 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.816541 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.316523141 +0000 UTC m=+118.070591454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.921320 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:12 crc kubenswrapper[4826]: E0319 18:58:12.922633 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.422617946 +0000 UTC m=+118.176686259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.923063 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53360: no serving certificate available for the kubelet" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.950521 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" podStartSLOduration=56.950500584 podStartE2EDuration="56.950500584s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.950113894 +0000 UTC m=+117.704182207" watchObservedRunningTime="2026-03-19 18:58:12.950500584 +0000 UTC m=+117.704568887" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.972697 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" podStartSLOduration=56.972665587 podStartE2EDuration="56.972665587s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:12.888810257 +0000 UTC m=+117.642878570" watchObservedRunningTime="2026-03-19 18:58:12.972665587 +0000 UTC m=+117.726733900" Mar 19 18:58:12 crc kubenswrapper[4826]: I0319 18:58:12.979295 4826 scope.go:117] "RemoveContainer" 
containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.013707 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podStartSLOduration=57.013676472 podStartE2EDuration="57.013676472s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.000098393 +0000 UTC m=+117.754166716" watchObservedRunningTime="2026-03-19 18:58:13.013676472 +0000 UTC m=+117.767744775" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.014346 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.023848 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.024319 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.524295601 +0000 UTC m=+118.278363914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.050935 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-98vw6" podStartSLOduration=56.050893403 podStartE2EDuration="56.050893403s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.049477045 +0000 UTC m=+117.803545518" watchObservedRunningTime="2026-03-19 18:58:13.050893403 +0000 UTC m=+117.804961716" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.066564 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53362: no serving certificate available for the kubelet" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.084191 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ph9t5"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.137713 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.138277 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.638250959 +0000 UTC m=+118.392319272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.166404 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.192523 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6hdz" podStartSLOduration=57.192495355 podStartE2EDuration="57.192495355s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.088133696 +0000 UTC m=+117.842202009" watchObservedRunningTime="2026-03-19 18:58:13.192495355 +0000 UTC m=+117.946563668" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.197011 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" podStartSLOduration=56.196999087 podStartE2EDuration="56.196999087s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
18:58:13.187238852 +0000 UTC m=+117.941307155" watchObservedRunningTime="2026-03-19 18:58:13.196999087 +0000 UTC m=+117.951067410" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.197477 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.196647 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53364: no serving certificate available for the kubelet" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.240420 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.249512 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cbmtf" podStartSLOduration=57.249488524 podStartE2EDuration="57.249488524s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.243571643 +0000 UTC m=+117.997639976" watchObservedRunningTime="2026-03-19 18:58:13.249488524 +0000 UTC m=+118.003556837" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.254720 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" event={"ID":"3abdacf1-5969-4ef1-a1f6-745a3750dfaa","Type":"ContainerStarted","Data":"99d4352c6df4fafe17a45de5e4e646d5742b1692d8c536f30b9dc415941f9b95"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.268458 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" event={"ID":"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a","Type":"ContainerStarted","Data":"357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.268842 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.269986 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.76993634 +0000 UTC m=+118.524004653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.270161 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.270798 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 18:58:13.770791234 +0000 UTC m=+118.524859547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.282320 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kzx65" podStartSLOduration=56.282294206 podStartE2EDuration="56.282294206s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.26625814 +0000 UTC m=+118.020326453" watchObservedRunningTime="2026-03-19 18:58:13.282294206 +0000 UTC m=+118.036362519" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.289512 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" event={"ID":"6f264ef7-44e5-4dee-91c5-89c87a000c9f","Type":"ContainerStarted","Data":"20c18f27978cb0f91f35a7d5b913f1a176bf9da81bc540de159fb5c2c544906e"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.316745 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6tq97" podStartSLOduration=56.316729972 podStartE2EDuration="56.316729972s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
18:58:13.314682417 +0000 UTC m=+118.068750740" watchObservedRunningTime="2026-03-19 18:58:13.316729972 +0000 UTC m=+118.070798285" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.336527 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" event={"ID":"a5ae7566-0aea-4736-8a36-3f4664ab9768","Type":"ContainerStarted","Data":"152eb94e851269d7cf0cfee9b75ef6fd6d4b9972fa300ff560ced4972e5c2ba8"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.338362 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" event={"ID":"d3d2b5c3-e37e-4a58-af35-8980d9c8d43a","Type":"ContainerStarted","Data":"05403313ef5a32e3e344227e995441ddc40172b029448ac0e56141b3593b876e"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.352411 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-hxldh" event={"ID":"509cd3a8-f3bb-4214-a70b-e589905ad242","Type":"ContainerStarted","Data":"f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.355414 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jgpxv" podStartSLOduration=57.355375843 podStartE2EDuration="57.355375843s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.351380185 +0000 UTC m=+118.105448498" watchObservedRunningTime="2026-03-19 18:58:13.355375843 +0000 UTC m=+118.109444166" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.374385 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.375176 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.875155891 +0000 UTC m=+118.629224204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.385565 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" event={"ID":"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb","Type":"ContainerStarted","Data":"bbd0d686a7cdaa8e6ea45a2660fc4c09b1f9a34aceea078edf7f6ac506405339"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.411223 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" event={"ID":"1392e2fd-a142-4584-8df9-7470b9441a3d","Type":"ContainerStarted","Data":"9633cf8d008a51be86a0f3637ed74da336d55333ae2246aeb90bbec59e6f4097"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.435457 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podStartSLOduration=56.43544124 podStartE2EDuration="56.43544124s" 
podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.434943687 +0000 UTC m=+118.189012000" watchObservedRunningTime="2026-03-19 18:58:13.43544124 +0000 UTC m=+118.189509553" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.464507 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7jk4n"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.475071 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cx4gd"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.478247 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.479608 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:13.979578161 +0000 UTC m=+118.733646474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.512204 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.540207 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" event={"ID":"90903b96-fde6-4803-93ab-15ecd16645af","Type":"ContainerStarted","Data":"641450f5b208674ec4dde3742290234c30c41b7d45e2ba8545d943429a5ea49d"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.545721 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6sfrz" podStartSLOduration=57.545697679 podStartE2EDuration="57.545697679s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.51925843 +0000 UTC m=+118.273326743" watchObservedRunningTime="2026-03-19 18:58:13.545697679 +0000 UTC m=+118.299765992" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.556885 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.573969 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" 
event={"ID":"a2760a70-3c84-42db-824f-1ed69c419347","Type":"ContainerStarted","Data":"23a87dcafe72c88cf95bceba056a72c51c371ea35daac22590b0d1104fbeb3ec"} Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.574038 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.576246 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.576293 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.585249 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.586777 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.086740494 +0000 UTC m=+118.840808807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.587569 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.601634 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.101612819 +0000 UTC m=+118.855681132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.637017 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7fdpm"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.637478 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.696911 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.699035 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.199010667 +0000 UTC m=+118.953078990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.699218 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.716153 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.216120243 +0000 UTC m=+118.970188556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.717430 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" podStartSLOduration=57.717408908 podStartE2EDuration="57.717408908s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.677333368 +0000 UTC m=+118.431401681" watchObservedRunningTime="2026-03-19 18:58:13.717408908 +0000 UTC m=+118.471477221" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.800116 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.803356 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.303280133 +0000 UTC m=+119.057348446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: W0319 18:58:13.875488 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d732543_0ecb_4570_bbfb_8a80570674d5.slice/crio-f9b44251d3c797fe03ab30169daa99bf5b5ad1e9ec8bfee018d4e854a5663d3b WatchSource:0}: Error finding container f9b44251d3c797fe03ab30169daa99bf5b5ad1e9ec8bfee018d4e854a5663d3b: Status 404 returned error can't find the container with id f9b44251d3c797fe03ab30169daa99bf5b5ad1e9ec8bfee018d4e854a5663d3b Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.886937 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tmbjc" podStartSLOduration=56.886893667 podStartE2EDuration="56.886893667s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.853249142 +0000 UTC m=+118.607317445" watchObservedRunningTime="2026-03-19 18:58:13.886893667 +0000 UTC m=+118.640961980" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.893444 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.904692 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 
18:58:13.905320 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:13 crc kubenswrapper[4826]: E0319 18:58:13.906290 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.406236772 +0000 UTC m=+119.160305085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.965587 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53368: no serving certificate available for the kubelet" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.976116 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lfzff" podStartSLOduration=56.976094631 podStartE2EDuration="56.976094631s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.975692161 +0000 UTC m=+118.729760494" watchObservedRunningTime="2026-03-19 
18:58:13.976094631 +0000 UTC m=+118.730162944" Mar 19 18:58:13 crc kubenswrapper[4826]: I0319 18:58:13.978050 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" podStartSLOduration=57.978041495 podStartE2EDuration="57.978041495s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:13.939172868 +0000 UTC m=+118.693241191" watchObservedRunningTime="2026-03-19 18:58:13.978041495 +0000 UTC m=+118.732109808" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.027843 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.028530 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.528506467 +0000 UTC m=+119.282574780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.042743 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nw52z" podStartSLOduration=57.042716763 podStartE2EDuration="57.042716763s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:14.014507447 +0000 UTC m=+118.768575760" watchObservedRunningTime="2026-03-19 18:58:14.042716763 +0000 UTC m=+118.796785066" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.062839 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podStartSLOduration=6.06281364 podStartE2EDuration="6.06281364s" podCreationTimestamp="2026-03-19 18:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:14.062415219 +0000 UTC m=+118.816483532" watchObservedRunningTime="2026-03-19 18:58:14.06281364 +0000 UTC m=+118.816881953" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.129728 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: 
\"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.130167 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.630152681 +0000 UTC m=+119.384220994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.231772 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.232235 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.732201056 +0000 UTC m=+119.486269369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.298086 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.322775 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.334719 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.335103 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.835088384 +0000 UTC m=+119.589156697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.436804 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.437029 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.936996664 +0000 UTC m=+119.691064977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.437098 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.437809 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:14.937802186 +0000 UTC m=+119.691870499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.491450 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.492744 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.496850 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.512406 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.541671 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.542166 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 
19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.542210 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zp8h\" (UniqueName: \"kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.542284 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.542418 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.042397481 +0000 UTC m=+119.796465794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.576779 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.576901 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.590898 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.613559 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.620420 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.631071 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647727 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647802 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647837 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647861 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68bp\" (UniqueName: \"kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp\") pod 
\"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647887 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647942 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.647973 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zp8h\" (UniqueName: \"kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.648672 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.14864045 +0000 UTC m=+119.902708763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.648997 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.649083 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.690447 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vxvlt" event={"ID":"a2760a70-3c84-42db-824f-1ed69c419347","Type":"ContainerStarted","Data":"60deb0e10b313a9ee95d66d624a9c3daadd0fff2b911a52661dfe7be978700c2"} Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.727061 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" event={"ID":"f2201d52-2767-4119-9f15-8e3da0ee8570","Type":"ContainerStarted","Data":"a2637a4359b2730d6ddd80061b14ae717287188d4359010529dc9f211220a2e6"} Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.727363 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5zp8h\" (UniqueName: \"kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h\") pod \"certified-operators-psc5t\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751133 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751338 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751384 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751401 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751429 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751470 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68bp\" (UniqueName: \"kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.751501 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: E0319 18:58:14.752428 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.252409051 +0000 UTC m=+120.006477364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.754193 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.766990 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" event={"ID":"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1","Type":"ContainerStarted","Data":"708b8dbea65dd6b1edfae5dd0b21d9ddd789391891e46c27b72f0aa548caec41"} Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.768151 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" podStartSLOduration=57.768125168 podStartE2EDuration="57.768125168s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:14.76670148 +0000 UTC m=+119.520769803" watchObservedRunningTime="2026-03-19 18:58:14.768125168 +0000 UTC m=+119.522193481" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.768916 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.769159 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:14 crc kubenswrapper[4826]: I0319 18:58:14.770309 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.788340 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.809264 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" event={"ID":"a7b17be3-fa05-45b6-ba33-b514326061b1","Type":"ContainerStarted","Data":"fc54f3ac254635ce002ff58cf6b8bbc7290258821d2befbac2a042676d3f644d"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.815647 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68bp\" (UniqueName: 
\"kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp\") pod \"community-operators-lwdqq\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.817294 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mprm6"] Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.818683 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.832764 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mprm6"] Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.844001 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" event={"ID":"6d732543-0ecb-4570-bbfb-8a80570674d5","Type":"ContainerStarted","Data":"f9b44251d3c797fe03ab30169daa99bf5b5ad1e9ec8bfee018d4e854a5663d3b"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.853597 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.853683 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc 
kubenswrapper[4826]: I0319 18:58:14.853710 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.853734 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzm69\" (UniqueName: \"kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.853763 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:14.854133 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.354107706 +0000 UTC m=+120.108176009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.864218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" event={"ID":"fdb49b25-5e81-4f9d-9a17-34bade2cec18","Type":"ContainerStarted","Data":"044f92f86aba2991d80748642db0c047dfdc7107f38c0fd838b88fe12ce76934"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.867454 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.882969 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jfxcc" podStartSLOduration=57.882952001 podStartE2EDuration="57.882952001s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:14.851749053 +0000 UTC m=+119.605817366" watchObservedRunningTime="2026-03-19 18:58:14.882952001 +0000 UTC m=+119.637020314" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.884193 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.934334 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" event={"ID":"a5ae7566-0aea-4736-8a36-3f4664ab9768","Type":"ContainerStarted","Data":"79b98f7a448993b829f355a82bc005a4bc8345fafcdb36301acdb539122c0d0f"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.950642 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" event={"ID":"0d3acdc6-8778-4094-af1f-8f3824029d90","Type":"ContainerStarted","Data":"68b7cdacbe181946539534a998efb33bd3018b114c6e91f42b493bf2d1faa02b"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.967872 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.968331 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.968363 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc 
kubenswrapper[4826]: I0319 18:58:14.968382 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzm69\" (UniqueName: \"kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:14.971205 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.471129478 +0000 UTC m=+120.225197791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.971932 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.972684 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " 
pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.982610 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:14.996666 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.002627 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" event={"ID":"4858c7f7-6a71-40dc-8222-082f6d97504c","Type":"ContainerStarted","Data":"1576b91cf818fd7de613b5438b0bc85e1e77b30f68a43c9c1ff14ba11799209c"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.002818 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.011189 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.026090 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.026470 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ph9t5" event={"ID":"a4a3e741-fc60-4076-8167-0e7cc776345e","Type":"ContainerStarted","Data":"6d451e6a726a97984b2b3162445c08aaa0b0df23c2f7279c4936e02182bb843a"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.037216 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.049904 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzm69\" (UniqueName: \"kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69\") pod \"certified-operators-mprm6\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.054610 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.060356 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" event={"ID":"72f0a310-1676-49a4-826a-d83406d28e93","Type":"ContainerStarted","Data":"76591b9cfad14c68b1ef112f6ed7cca58927da7f28bfa6fafae17389b99d7728"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.061076 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.068381 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" event={"ID":"781f0741-f222-4ccc-aa80-6dde59e9648d","Type":"ContainerStarted","Data":"8b0ba342dc18287b7d03756611a5c9647857168fb16fc103e470e4ca570d9483"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.069374 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " 
pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.069427 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.069563 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qghf\" (UniqueName: \"kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.069597 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.069893 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.569869613 +0000 UTC m=+120.323937926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.071525 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.111827 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.111891 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.114125 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" event={"ID":"86f5311b-39ed-455f-a9bc-b83044d63db8","Type":"ContainerStarted","Data":"bbb7e08499f711602a480215d582d883771a80279c65a4f7839619d70f8c3a13"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.174376 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.174673 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qghf\" (UniqueName: \"kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.174698 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.174731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.174847 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.674828987 +0000 UTC m=+120.428897300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.177524 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.177996 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p892r" event={"ID":"db8ad588-15a8-47f2-97d5-950d4a757183","Type":"ContainerStarted","Data":"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.178057 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.185151 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8ghl4" podStartSLOduration=59.185130138 podStartE2EDuration="59.185130138s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 18:58:15.177378196 +0000 UTC m=+119.931446509" watchObservedRunningTime="2026-03-19 18:58:15.185130138 +0000 UTC m=+119.939198451" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.206188 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" event={"ID":"18582bb6-239d-4f2e-91a4-2ea59fb54d71","Type":"ContainerStarted","Data":"a518dc620f2f9eca2fdab8ae112dd974c28c50f76bd594776ee49d0c52844203"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.216020 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cx4gd" event={"ID":"4a42d5d3-6341-4be7-a1fc-69f49476197f","Type":"ContainerStarted","Data":"90bcf9880e9a5f5618b3d358c985201eff61b1fce07206be5610321c9524a68b"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.237925 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" event={"ID":"6f264ef7-44e5-4dee-91c5-89c87a000c9f","Type":"ContainerStarted","Data":"6be9fd29bc26095fe3bfb7b25ecbb9dc3c972ab66fd0a850de6dcd893ecd9668"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.253113 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qghf\" (UniqueName: \"kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf\") pod \"community-operators-8rs7z\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.286383 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.287943 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.787911122 +0000 UTC m=+120.541979435 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.374024 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podStartSLOduration=59.373992153 podStartE2EDuration="59.373992153s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.353604438 +0000 UTC m=+120.107672751" watchObservedRunningTime="2026-03-19 18:58:15.373992153 +0000 UTC m=+120.128060466" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.374904 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podStartSLOduration=58.374897277 podStartE2EDuration="58.374897277s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.286162365 +0000 UTC m=+120.040230678" 
watchObservedRunningTime="2026-03-19 18:58:15.374897277 +0000 UTC m=+120.128965590" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.386577 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53382: no serving certificate available for the kubelet" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.387513 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.388833 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:15.888814776 +0000 UTC m=+120.642883089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.415784 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drbf6" event={"ID":"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6","Type":"ContainerStarted","Data":"e5ad2e2cb4a689f1fcd62421183d45d8058fdf428b04c4a6dd5bcdf89ea60bd9"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.509155 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.516395 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.016374514 +0000 UTC m=+120.770442827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.523665 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.526071 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vj6zq" podStartSLOduration=58.526054927 podStartE2EDuration="58.526054927s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.523290892 +0000 UTC m=+120.277359205" watchObservedRunningTime="2026-03-19 18:58:15.526054927 +0000 UTC m=+120.280123240" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.578050 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" event={"ID":"cf827e9c-2ada-440e-a3e0-99deb1eb54c1","Type":"ContainerStarted","Data":"38a2e9f7f317160213e38339c11817369d592a8f5eed83c58e724eedec2f14d3"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.612185 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.621415 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.113895496 +0000 UTC m=+120.867963809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.637454 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.677004 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" event={"ID":"7c53abe4-412d-47a0-bccc-ec9e6f4d8784","Type":"ContainerStarted","Data":"4b0b2244dc2d1bfa26f6b6b2f0832b5da4f1ff6e82bc6b2786fa65bc080e6611"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.705558 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" podStartSLOduration=59.705525057 podStartE2EDuration="59.705525057s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.66036228 +0000 UTC m=+120.414430603" watchObservedRunningTime="2026-03-19 18:58:15.705525057 +0000 UTC m=+120.459593370" Mar 19 18:58:15 crc 
kubenswrapper[4826]: I0319 18:58:15.723810 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.725193 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.225173443 +0000 UTC m=+120.979241756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.765525 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" event={"ID":"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9","Type":"ContainerStarted","Data":"d31b3b850c369f0d05e723c9102b3b9f24bb6a7e11c48499933d9534695a950c"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.767752 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fc4kh" event={"ID":"e4f6594b-0c99-449b-ac20-4aae09000a73","Type":"ContainerStarted","Data":"3ab9dbbf3cf7eedce5c20967a164da7e318138c5cb718813db05f3d7dbf7b226"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.770716 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fdpm" event={"ID":"9f822d71-562c-4d2c-917f-82281bef6c8a","Type":"ContainerStarted","Data":"88bcfdb334984330951ef9a43d451f9aec49bae9be2e3d600b36aed40a11f18f"} Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.771746 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerName="controller-manager" containerID="cri-o://a677e0802b103dc9d367ba48513b25e441755829fbbaadda602f9b0a35a2ad74" gracePeriod=30 Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.774790 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.781493 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.824725 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.824899 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.324864383 +0000 UTC m=+121.078932696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.830722 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.843726 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.343709795 +0000 UTC m=+121.097778108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.866556 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.874551 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-p892r" podStartSLOduration=59.874535093 podStartE2EDuration="59.874535093s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.749257467 +0000 UTC m=+120.503325780" watchObservedRunningTime="2026-03-19 18:58:15.874535093 +0000 UTC m=+120.628603406" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.876849 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" podStartSLOduration=58.876842236 podStartE2EDuration="58.876842236s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.873574357 +0000 UTC m=+120.627642670" watchObservedRunningTime="2026-03-19 18:58:15.876842236 +0000 UTC m=+120.630910549" Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.934428 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:15 crc kubenswrapper[4826]: E0319 18:58:15.935417 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.435391439 +0000 UTC m=+121.189459752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:15 crc kubenswrapper[4826]: I0319 18:58:15.950869 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-drbf6" podStartSLOduration=58.950849639 podStartE2EDuration="58.950849639s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:15.949186043 +0000 UTC m=+120.703254356" watchObservedRunningTime="2026-03-19 18:58:15.950849639 +0000 UTC m=+120.704917942" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.039468 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.043806 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.543789986 +0000 UTC m=+121.297858299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.069462 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" podStartSLOduration=59.069436933 podStartE2EDuration="59.069436933s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.037833474 +0000 UTC m=+120.791901797" watchObservedRunningTime="2026-03-19 18:58:16.069436933 +0000 UTC m=+120.823505236" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.085880 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.085917 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.128037 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6vlbh" podStartSLOduration=59.128007926 podStartE2EDuration="59.128007926s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.119529275 +0000 UTC m=+120.873597598" watchObservedRunningTime="2026-03-19 18:58:16.128007926 +0000 UTC m=+120.882076239" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.144883 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.145576 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.645553763 +0000 UTC m=+121.399622066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.242884 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.249991 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.250317 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.750303572 +0000 UTC m=+121.504371885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.263360 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:16 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:16 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:16 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.264479 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.313516 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.31348827 podStartE2EDuration="313.48827ms" podCreationTimestamp="2026-03-19 18:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.278116987 +0000 UTC m=+121.032185300" watchObservedRunningTime="2026-03-19 18:58:16.31348827 +0000 UTC m=+121.067556583" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.314150 4826 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/machine-config-server-fc4kh" podStartSLOduration=8.314144037 podStartE2EDuration="8.314144037s" podCreationTimestamp="2026-03-19 18:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.224080028 +0000 UTC m=+120.978148341" watchObservedRunningTime="2026-03-19 18:58:16.314144037 +0000 UTC m=+121.068212350" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.330843 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.351483 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.351942 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.851902744 +0000 UTC m=+121.605971227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.454310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.455239 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:16.955226734 +0000 UTC m=+121.709295047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.550125 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.559905 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.560299 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.06027808 +0000 UTC m=+121.814346393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.560967 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.565459 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.589166 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.661895 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.661964 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.662036 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcngb\" (UniqueName: \"kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.662083 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.666105 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.166086648 +0000 UTC m=+121.920154961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.766760 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.767390 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.267366041 +0000 UTC m=+122.021434364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.767848 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcngb\" (UniqueName: \"kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.767894 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.768375 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.767951 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content\") pod \"redhat-marketplace-2fpvj\" (UID: 
\"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.768699 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.769094 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.769291 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.269277983 +0000 UTC m=+122.023346286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.844268 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcngb\" (UniqueName: \"kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb\") pod \"redhat-marketplace-2fpvj\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.848274 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" event={"ID":"4858c7f7-6a71-40dc-8222-082f6d97504c","Type":"ContainerStarted","Data":"67227ad1999624c280a3bf5925ca8570a04b346a25b44f4988aff9a21b97cb48"} Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.851758 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.880155 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.880888 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.380851517 +0000 UTC m=+122.134919830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.882242 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" event={"ID":"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9","Type":"ContainerStarted","Data":"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef"} Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.883159 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.891278 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-m7zht"] Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.907731 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" event={"ID":"18582bb6-239d-4f2e-91a4-2ea59fb54d71","Type":"ContainerStarted","Data":"291f2bfd2fd19b51f22cb732e6d9ca88d3ce22f547b3f6d9407d48fc5cdf257f"} Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.911370 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" 
event={"ID":"18582bb6-239d-4f2e-91a4-2ea59fb54d71","Type":"ContainerStarted","Data":"785d856a1cfc436d7faa12c4c0d36dbec765dea62c8051279b17662f5754d96b"} Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.909341 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podStartSLOduration=59.90930327 podStartE2EDuration="59.90930327s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.902211058 +0000 UTC m=+121.656279371" watchObservedRunningTime="2026-03-19 18:58:16.90930327 +0000 UTC m=+121.663371583" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.911527 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-flj2h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.911584 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.944150 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.944610 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a7bf203e75be9362b1d1416119e99869df66a71f1b2d4686a9691d5fec5a7b23"} Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.965330 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" podStartSLOduration=59.965289793 podStartE2EDuration="59.965289793s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:16.960746209 +0000 UTC m=+121.714814522" watchObservedRunningTime="2026-03-19 18:58:16.965289793 +0000 UTC m=+121.719358096" Mar 19 18:58:16 crc kubenswrapper[4826]: I0319 18:58:16.982678 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:16 crc kubenswrapper[4826]: E0319 18:58:16.983027 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.483015305 +0000 UTC m=+122.237083618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.010803 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.012150 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.019277 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r4w59" podStartSLOduration=60.019259311 podStartE2EDuration="1m0.019259311s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.016576018 +0000 UTC m=+121.770644341" watchObservedRunningTime="2026-03-19 18:58:17.019259311 +0000 UTC m=+121.773327624" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.032244 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mprm6"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.032307 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" event={"ID":"cf827e9c-2ada-440e-a3e0-99deb1eb54c1","Type":"ContainerStarted","Data":"3caf1336f682d6441d8e108e7000c8e6347918f89fff2cfd185a70727a59b501"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.038181 4826 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.061166 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.086626 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.087054 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.587015783 +0000 UTC m=+122.341084096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.087156 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.087183 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.087226 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.087272 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkc6j\" (UniqueName: 
\"kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.089000 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.588980737 +0000 UTC m=+122.343049050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.108007 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jmhs5" podStartSLOduration=60.107985273 podStartE2EDuration="1m0.107985273s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.105954898 +0000 UTC m=+121.860023211" watchObservedRunningTime="2026-03-19 18:58:17.107985273 +0000 UTC m=+121.862053586" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.135486 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 
18:58:17.137741 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.189696 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.190060 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkc6j\" (UniqueName: \"kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.190150 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.190178 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.190809 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities\") pod \"redhat-marketplace-vsrvh\" (UID: 
\"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.192564 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.692528952 +0000 UTC m=+122.446597265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.193134 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.206327 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" event={"ID":"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb","Type":"ContainerStarted","Data":"704feb3759aa2ea56bbd87171d4eb41ed72c669eac178bfeb28f17d077c30eae"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.207931 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.248803 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" event={"ID":"7c53abe4-412d-47a0-bccc-ec9e6f4d8784","Type":"ContainerStarted","Data":"c9b0fe2e7f9ebfff2357c08b0c5c9bdbaa33ce2f44b1e85e2a7eb86f4a52b539"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.262270 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" event={"ID":"6d732543-0ecb-4570-bbfb-8a80570674d5","Type":"ContainerStarted","Data":"16df905742abaceffc2a8c87db7c05e8bfa3d301f3b7c546e4deee033ab562f8"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.268353 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.279063 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkc6j\" (UniqueName: \"kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j\") pod \"redhat-marketplace-vsrvh\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.291937 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.292244 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.792232863 +0000 UTC m=+122.546301176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.292736 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:17 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:17 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:17 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.292799 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.305568 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=34.305529244 podStartE2EDuration="34.305529244s" podCreationTimestamp="2026-03-19 18:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.289757416 
+0000 UTC m=+122.043825749" watchObservedRunningTime="2026-03-19 18:58:17.305529244 +0000 UTC m=+122.059597567" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.313162 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cx4gd" event={"ID":"4a42d5d3-6341-4be7-a1fc-69f49476197f","Type":"ContainerStarted","Data":"02e430fcf08217055316c36d211773666bfecefc8939e5562bee7ba9a1bd78a2"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.362411 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" podStartSLOduration=60.362391501 podStartE2EDuration="1m0.362391501s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.339244471 +0000 UTC m=+122.093312774" watchObservedRunningTime="2026-03-19 18:58:17.362391501 +0000 UTC m=+122.116459814" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.376388 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7jk4n" event={"ID":"f2201d52-2767-4119-9f15-8e3da0ee8570","Type":"ContainerStarted","Data":"7b4f8658273bfc134e464840d85906d90247329c57582c8d616a7ff33dc82cc7"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392365 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca\") pod \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392425 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config\") pod \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\" 
(UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392511 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392534 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert\") pod \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392554 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mgsm\" (UniqueName: \"kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm\") pod \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.392613 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles\") pod \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\" (UID: \"8db9fbca-87a7-4706-9aef-e78fa4fefe16\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.394543 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.399600 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca" (OuterVolumeSpecName: "client-ca") pod "8db9fbca-87a7-4706-9aef-e78fa4fefe16" (UID: "8db9fbca-87a7-4706-9aef-e78fa4fefe16"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.400217 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config" (OuterVolumeSpecName: "config") pod "8db9fbca-87a7-4706-9aef-e78fa4fefe16" (UID: "8db9fbca-87a7-4706-9aef-e78fa4fefe16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.400924 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:17.900893197 +0000 UTC m=+122.654961680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.401845 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8db9fbca-87a7-4706-9aef-e78fa4fefe16" (UID: "8db9fbca-87a7-4706-9aef-e78fa4fefe16"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.417078 4826 generic.go:334] "Generic (PLEG): container finished" podID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerID="a677e0802b103dc9d367ba48513b25e441755829fbbaadda602f9b0a35a2ad74" exitCode=0 Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.417298 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" event={"ID":"8db9fbca-87a7-4706-9aef-e78fa4fefe16","Type":"ContainerDied","Data":"a677e0802b103dc9d367ba48513b25e441755829fbbaadda602f9b0a35a2ad74"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.417341 4826 scope.go:117] "RemoveContainer" containerID="a677e0802b103dc9d367ba48513b25e441755829fbbaadda602f9b0a35a2ad74" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.417699 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-brsbv" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.437058 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" podStartSLOduration=61.43703708 podStartE2EDuration="1m1.43703708s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.433170695 +0000 UTC m=+122.187239008" watchObservedRunningTime="2026-03-19 18:58:17.43703708 +0000 UTC m=+122.191105393" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.449372 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm" (OuterVolumeSpecName: "kube-api-access-9mgsm") pod "8db9fbca-87a7-4706-9aef-e78fa4fefe16" (UID: "8db9fbca-87a7-4706-9aef-e78fa4fefe16"). InnerVolumeSpecName "kube-api-access-9mgsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.450286 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8db9fbca-87a7-4706-9aef-e78fa4fefe16" (UID: "8db9fbca-87a7-4706-9aef-e78fa4fefe16"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.493411 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9bmdx" event={"ID":"1392e2fd-a142-4584-8df9-7470b9441a3d","Type":"ContainerStarted","Data":"8a491d4cc1aee61a32acc6ed3464d0b7d28a90ac03a1767805a28cde79b34be3"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.529842 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.530129 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.530147 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.530162 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db9fbca-87a7-4706-9aef-e78fa4fefe16-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.530173 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db9fbca-87a7-4706-9aef-e78fa4fefe16-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.530184 4826 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-9mgsm\" (UniqueName: \"kubernetes.io/projected/8db9fbca-87a7-4706-9aef-e78fa4fefe16-kube-api-access-9mgsm\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.532240 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.032222049 +0000 UTC m=+122.786290352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.585410 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerStarted","Data":"538c7fed2191a231d74d752f0d7bd6c3b9fcdba88b9571b2b897b3defb394568"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.625731 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.626208 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerName="controller-manager" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.626224 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerName="controller-manager" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.626293 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cx4gd" podStartSLOduration=10.626269146 podStartE2EDuration="10.626269146s" podCreationTimestamp="2026-03-19 18:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.544081182 +0000 UTC m=+122.298149485" watchObservedRunningTime="2026-03-19 18:58:17.626269146 +0000 UTC m=+122.380337459" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.626363 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" containerName="controller-manager" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.626876 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.631082 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.631883 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.632026 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.632143 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.632612 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w97r\" (UniqueName: \"kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.632792 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.633563 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.133531913 +0000 UTC m=+122.887600226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.635291 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" event={"ID":"781f0741-f222-4ccc-aa80-6dde59e9648d","Type":"ContainerStarted","Data":"9f3ea13753127e75061a238605c9b00b1148fb1f5ff5ec51558df743b2cf27d5"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.677068 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.678365 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.680317 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.687637 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fdpm" event={"ID":"9f822d71-562c-4d2c-917f-82281bef6c8a","Type":"ContainerStarted","Data":"c4b13f8693c6e77e7089b4354c54a5ff157080394fea37de5aa348cea0c8caea"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.694530 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.706389 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.727249 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ph9t5" event={"ID":"a4a3e741-fc60-4076-8167-0e7cc776345e","Type":"ContainerStarted","Data":"8864b360a6f07d5935c2e50e4fc545579cfb8306cac2dda7a16db4dffb3bd04a"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.727967 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735245 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735328 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2w97r\" (UniqueName: \"kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735367 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735389 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735422 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735470 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbbb\" (UniqueName: \"kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 
18:58:17.735503 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735525 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.735570 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.747412 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.749039 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7fdpm" podStartSLOduration=61.749004414 podStartE2EDuration="1m1.749004414s" podCreationTimestamp="2026-03-19 18:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.734739626 +0000 UTC m=+122.488807929" watchObservedRunningTime="2026-03-19 18:58:17.749004414 +0000 UTC m=+122.503072717" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.759436 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerStarted","Data":"be6fcb17cd5035bc0108081053febfd13705bdea7bba9efe0eb9505951f08038"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.759965 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.760323 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.260304761 +0000 UTC m=+123.014373074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.761167 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.791160 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.794730 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ph9t5" podStartSLOduration=10.790648296 podStartE2EDuration="10.790648296s" podCreationTimestamp="2026-03-19 18:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.778624179 +0000 UTC m=+122.532692502" watchObservedRunningTime="2026-03-19 18:58:17.790648296 +0000 UTC m=+122.544716609" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.799043 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2w97r\" (UniqueName: \"kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r\") pod \"controller-manager-574fb44d96-6zwvm\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.818981 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.831461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" event={"ID":"fdb49b25-5e81-4f9d-9a17-34bade2cec18","Type":"ContainerStarted","Data":"d3a209fa43a7430fe81c94d1187f4b67a6e187072f60a4066fd2e2e620507871"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.832505 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.847446 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.847828 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbbb\" (UniqueName: \"kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.847935 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.847958 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.848359 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.848435 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.348417587 +0000 UTC m=+123.102485900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.849269 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.852346 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-brsbv"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.884762 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbbb\" (UniqueName: \"kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb\") pod \"redhat-operators-zkslk\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.888689 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" event={"ID":"7b3f973e-b13a-45f3-a9cc-b84a8a8310a1","Type":"ContainerStarted","Data":"bfc4b3a8f2c8b5a43ac491b29b1901c9c3dc32f04aa84e58ca45508e03ce10c3"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.890800 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 18:58:17 crc 
kubenswrapper[4826]: I0319 18:58:17.893044 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.915519 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" event={"ID":"4e673de9-6eb1-430b-8123-1254957f125f","Type":"ContainerStarted","Data":"ff9e65eee2422ae7722509b7ce511151420b1ccfaaa19aedd283607dc0d4e579"} Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.924521 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podStartSLOduration=60.924495875 podStartE2EDuration="1m0.924495875s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:17.922029199 +0000 UTC m=+122.676097512" watchObservedRunningTime="2026-03-19 18:58:17.924495875 +0000 UTC m=+122.678564188" Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.933521 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerName="route-controller-manager" containerID="cri-o://94a5de13a40094bcfc5b95efc4f3f8adfa66b38cd34fcadc97d5a1981791a664" gracePeriod=30 Mar 19 18:58:17 crc kubenswrapper[4826]: W0319 18:58:17.935860 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0d7689_755d_4e24_a337_4177c37c2437.slice/crio-b95365443890536067099892c17281b9ba29d7aa7272278995046563a6ae5121 WatchSource:0}: Error finding container b95365443890536067099892c17281b9ba29d7aa7272278995046563a6ae5121: Status 404 returned error can't find the container with id 
b95365443890536067099892c17281b9ba29d7aa7272278995046563a6ae5121 Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.949092 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:17 crc kubenswrapper[4826]: E0319 18:58:17.977457 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.477434695 +0000 UTC m=+123.231503008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:17 crc kubenswrapper[4826]: I0319 18:58:17.978422 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.002470 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.034717 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.051056 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.051510 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.551489259 +0000 UTC m=+123.305557582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.065677 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db9fbca-87a7-4706-9aef-e78fa4fefe16" path="/var/lib/kubelet/pods/8db9fbca-87a7-4706-9aef-e78fa4fefe16/volumes" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.070514 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.071795 4826 ???:1] "http: TLS handshake error from 192.168.126.11:53384: no serving certificate available for the kubelet" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.072516 4826 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.099753 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.103814 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z8vj2" podStartSLOduration=61.10379136 podStartE2EDuration="1m1.10379136s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:18.028948026 +0000 UTC m=+122.783016339" watchObservedRunningTime="2026-03-19 18:58:18.10379136 +0000 UTC m=+122.857859673" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.152976 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.153059 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs6z8\" (UniqueName: \"kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.153081 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.153101 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.153454 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.653434861 +0000 UTC m=+123.407503174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.177931 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.179101 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.182427 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.185997 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.207974 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.248508 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:18 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:18 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:18 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.248579 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.255593 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.255693 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.755673611 +0000 UTC m=+123.509741914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256029 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs6z8\" (UniqueName: \"kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256058 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256081 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256103 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256131 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.256172 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.256460 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.756453161 +0000 UTC m=+123.510521464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.257081 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.257295 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.280382 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs6z8\" (UniqueName: \"kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8\") pod \"redhat-operators-rrk5r\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.357751 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.357916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.357952 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.358130 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.858095786 +0000 UTC m=+123.612164099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.358221 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.406102 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.436026 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.459434 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.459822 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:18.959809352 +0000 UTC m=+123.713877665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.544742 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.562879 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 
18:58:18.563288 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.063268095 +0000 UTC m=+123.817336408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.582802 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.607108 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.666290 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.666645 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.166633175 +0000 UTC m=+123.920701488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.767960 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.768583 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.268564287 +0000 UTC m=+124.022632600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.871364 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.871785 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.371769133 +0000 UTC m=+124.125837446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.911595 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.927706 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 18:58:18 crc kubenswrapper[4826]: W0319 18:58:18.948112 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be5868b_c0a8_45a6_8b09_5020ed53e863.slice/crio-c99c5bf4e04fb0295d6e9006dc419627c97df31f5d4c3689c808bb0d2d429f3a WatchSource:0}: Error finding container c99c5bf4e04fb0295d6e9006dc419627c97df31f5d4c3689c808bb0d2d429f3a: Status 404 returned error can't find the container with id c99c5bf4e04fb0295d6e9006dc419627c97df31f5d4c3689c808bb0d2d429f3a Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.963445 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fdpm" event={"ID":"9f822d71-562c-4d2c-917f-82281bef6c8a","Type":"ContainerStarted","Data":"62af4771d75ceb6bdc1f947928b2a61629cac972d443656bc1f2f05e623c4fe5"} Mar 19 18:58:18 crc kubenswrapper[4826]: I0319 18:58:18.973757 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:18 crc kubenswrapper[4826]: E0319 18:58:18.974269 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.47424806 +0000 UTC m=+124.228316373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.002018 4826 generic.go:334] "Generic (PLEG): container finished" podID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerID="fce0ed192b79f75994a875d13faaa198326ed2a587b5059461fafa6ede5ebe93" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.002170 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerDied","Data":"fce0ed192b79f75994a875d13faaa198326ed2a587b5059461fafa6ede5ebe93"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.002204 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerStarted","Data":"8808d6cdfcc08c439d0ad29456c5553236abe3af32f30aab313ebc2ce01b6ad1"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.027372 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.053939 4826 generic.go:334] "Generic (PLEG): container finished" podID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerID="01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.054045 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerDied","Data":"01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.064138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerStarted","Data":"853904801d629cce8382d4c3d0e460ba5c864b42e9656e1d27b51ed877edf187"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.065792 4826 generic.go:334] "Generic (PLEG): container finished" podID="007d8118-0079-4d3d-b764-01eadbd419c5" containerID="935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.065831 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerDied","Data":"935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.065846 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerStarted","Data":"830bfaecf6788968409c8dbaf53bcc31ac98b36d22acc6987e1535c0d27dcde2"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.071543 4826 generic.go:334] 
"Generic (PLEG): container finished" podID="0e0d7689-755d-4e24-a337-4177c37c2437" containerID="7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.071587 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerDied","Data":"7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.071603 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerStarted","Data":"b95365443890536067099892c17281b9ba29d7aa7272278995046563a6ae5121"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.076438 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jm24\" (UniqueName: \"kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24\") pod \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.076843 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca\") pod \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.076900 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert\") pod \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.077389 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.078096 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" (UID: "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.079435 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.579416019 +0000 UTC m=+124.333484332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.082769 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24" (OuterVolumeSpecName: "kube-api-access-9jm24") pod "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" (UID: "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4"). InnerVolumeSpecName "kube-api-access-9jm24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.092437 4826 generic.go:334] "Generic (PLEG): container finished" podID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerID="2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.092537 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerDied","Data":"2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.099198 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" (UID: "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.099732 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"68dfd4bf9ea9b92f709ae7f29c0c492b27f36660c09c393d31135321d0431c18"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.099857 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.102024 4826 generic.go:334] "Generic (PLEG): container finished" podID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerID="94a5de13a40094bcfc5b95efc4f3f8adfa66b38cd34fcadc97d5a1981791a664" exitCode=0 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.102086 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" event={"ID":"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4","Type":"ContainerDied","Data":"94a5de13a40094bcfc5b95efc4f3f8adfa66b38cd34fcadc97d5a1981791a664"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.102115 4826 scope.go:117] "RemoveContainer" containerID="94a5de13a40094bcfc5b95efc4f3f8adfa66b38cd34fcadc97d5a1981791a664" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.102253 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.106599 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a53596ef25e5a2f64ef34a13fc4aa6ab9b3d2ef57c42d9a43568c17276780b45"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.106621 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ba3228c9e5ecf694a3c2d8dc245e38424b62d367495f9a85c3757b4b0c435cf9"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.108528 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-76ppq" event={"ID":"6d732543-0ecb-4570-bbfb-8a80570674d5","Type":"ContainerStarted","Data":"c74b7136b810281732e1d2e6ec0393555dd6a158e3e0aa33375e61026c744676"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.133280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"eca9e677002d24d65c17a62afbb26f7c4a029452b7e7270b93e8ae4b52eb3e45"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.133582 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"dc559e9de218c6aa4f91e7c535e634d4458d7fbbefdc45327017a6f01f3fd6a7"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.185835 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config\") pod \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\" (UID: \"3d9978e8-9235-4a27-b28e-6e8ed8cf70c4\") " Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.186037 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.186171 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.186644 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jm24\" (UniqueName: \"kubernetes.io/projected/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-kube-api-access-9jm24\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.186688 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.186700 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.186888 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.686871592 +0000 UTC m=+124.440939905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.187356 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config" (OuterVolumeSpecName: "config") pod "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" (UID: "3d9978e8-9235-4a27-b28e-6e8ed8cf70c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.216292 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ph9t5" event={"ID":"a4a3e741-fc60-4076-8167-0e7cc776345e","Type":"ContainerStarted","Data":"2a29284b1a230b3d7dc175d661a15ccd015e3f562d17b5794c331d6204ee7341"} Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.219430 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-flj2h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.219481 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.219606 4826 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" gracePeriod=30 Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.246941 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:19 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:19 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:19 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.247381 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.289494 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.294717 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.296074 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4-config\") on 
node \"crc\" DevicePath \"\"" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.301634 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.801606781 +0000 UTC m=+124.555675094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.397528 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.397807 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:19.897786597 +0000 UTC m=+124.651854900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.450816 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.457683 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hcg9z"] Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.500684 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.501184 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.001160957 +0000 UTC m=+124.755229270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.603408 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.603732 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.103708476 +0000 UTC m=+124.857776789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.603883 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.605590 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.105576117 +0000 UTC m=+124.859644420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.706410 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.706719 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.206626044 +0000 UTC m=+124.960694357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.706851 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.707328 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.207292933 +0000 UTC m=+124.961361246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.745278 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.745941 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.750159 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.752636 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.763280 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.763375 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.815352 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.818047 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.318019224 +0000 UTC m=+125.072087527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.867730 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.868081 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerName="route-controller-manager" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.868102 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerName="route-controller-manager" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.868242 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" containerName="route-controller-manager" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.868750 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.873323 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.885523 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.890417 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.918480 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.918576 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.918605 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:19 crc kubenswrapper[4826]: E0319 18:58:19.919924 
4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.419911284 +0000 UTC m=+125.173979597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:19 crc kubenswrapper[4826]: I0319 18:58:19.990381 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9978e8-9235-4a27-b28e-6e8ed8cf70c4" path="/var/lib/kubelet/pods/3d9978e8-9235-4a27-b28e-6e8ed8cf70c4/volumes" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.011163 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.013186 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019349 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019491 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019739 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019795 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019813 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 
18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019861 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019889 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7qb\" (UniqueName: \"kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019893 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.019917 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.019991 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.519969345 +0000 UTC m=+125.274037658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.026854 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.027048 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.020072 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.027912 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.028257 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.028372 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 
18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.064284 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.128920 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.129021 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7qb\" (UniqueName: \"kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.129054 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.129083 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: 
\"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.129116 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.129479 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.629464072 +0000 UTC m=+125.383532385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.132591 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.136875 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.152366 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.191399 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7qb\" (UniqueName: \"kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb\") pod \"route-controller-manager-59f4d7d68d-p8tdq\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.200218 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.222335 4826 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.231650 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.231954 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.731918018 +0000 UTC m=+125.485986331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.232423 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.249645 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.749619279 +0000 UTC m=+125.503687592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.257049 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:20 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:20 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:20 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.257150 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.290831 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" event={"ID":"2be5868b-c0a8-45a6-8b09-5020ed53e863","Type":"ContainerStarted","Data":"bdced0f6c149d0c9f1dfc745c1329d1b49fc233f2768905e8c9a312e0b6e6f60"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.290898 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" event={"ID":"2be5868b-c0a8-45a6-8b09-5020ed53e863","Type":"ContainerStarted","Data":"c99c5bf4e04fb0295d6e9006dc419627c97df31f5d4c3689c808bb0d2d429f3a"} Mar 19 18:58:20 crc 
kubenswrapper[4826]: I0319 18:58:20.293164 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.314347 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.329540 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" podStartSLOduration=6.329515912 podStartE2EDuration="6.329515912s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:20.327384133 +0000 UTC m=+125.081452446" watchObservedRunningTime="2026-03-19 18:58:20.329515912 +0000 UTC m=+125.083584225" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.334897 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.335005 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.834985311 +0000 UTC m=+125.589053624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.335447 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.337099 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.837089617 +0000 UTC m=+125.591157930 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.337158 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.344161 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" event={"ID":"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb","Type":"ContainerStarted","Data":"bd37fed794e83c142bc5c4e1c66dd7bae9e4db8784dd35b63d0ce4275220b0e5"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.372088 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9055ec15-01ab-4655-9059-b8f38b404dc8","Type":"ContainerStarted","Data":"e7a27a357c10038f0e46f08c1e754c53a000cfb62c8731792e2997b3f14bc498"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.372154 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9055ec15-01ab-4655-9059-b8f38b404dc8","Type":"ContainerStarted","Data":"ee51fe8b4264e3fa94643db0ca4c36df757004d62f9b03d33018bf4209e48a53"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.399843 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerID="82335a34432660f4c22f60a6784a80e48e14edfc797730f574e363efdb7ab22a" exitCode=0 Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.399937 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerDied","Data":"82335a34432660f4c22f60a6784a80e48e14edfc797730f574e363efdb7ab22a"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.399979 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerStarted","Data":"cbb652796d0158a9aaa43d53c18282ff40d16f75b4f0526c646a1a3c05141110"} Mar 19 18:58:20 crc 
kubenswrapper[4826]: I0319 18:58:20.416591 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.416573069 podStartE2EDuration="2.416573069s" podCreationTimestamp="2026-03-19 18:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:20.414922464 +0000 UTC m=+125.168990777" watchObservedRunningTime="2026-03-19 18:58:20.416573069 +0000 UTC m=+125.170641382" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.434419 4826 generic.go:334] "Generic (PLEG): container finished" podID="7109581b-42ad-4e72-89be-ae269dcaea42" containerID="adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b" exitCode=0 Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.435998 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerDied","Data":"adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.436036 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerStarted","Data":"0bd6a7374fa5a9777c60eabca042d04ba570c647b7d7883c326f47d132019430"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.440922 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.444117 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:20.944079227 +0000 UTC m=+125.698147730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.492674 4826 generic.go:334] "Generic (PLEG): container finished" podID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerID="7a031b5b2dbd89420ef42b2c1ecc4234428d6efa972eb316e2ebbd27d0aa48ce" exitCode=0 Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.494377 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerDied","Data":"7a031b5b2dbd89420ef42b2c1ecc4234428d6efa972eb316e2ebbd27d0aa48ce"} Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.500117 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.508404 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.532305 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.533672 4826 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.533717 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.534013 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.534047 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.572874 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.575188 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 
18:58:20.576512 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.076490967 +0000 UTC m=+125.830559490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.676264 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.678165 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.178140682 +0000 UTC m=+125.932208995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.785426 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.786195 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.286183049 +0000 UTC m=+126.040251362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.885715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.885754 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.888930 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:20 crc kubenswrapper[4826]: E0319 18:58:20.889421 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.389401276 +0000 UTC m=+126.143469589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.919825 4826 patch_prober.go:28] interesting pod/console-f9d7485db-p892r container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 19 18:58:20 crc kubenswrapper[4826]: I0319 18:58:20.919890 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-p892r" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.000003 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.001845 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.501824743 +0000 UTC m=+126.255893056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mwhjm" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.065334 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.092757 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.101263 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.101859 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 18:58:21.601836943 +0000 UTC m=+126.355905266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.159443 4826 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T18:58:20.222368248Z","Handler":null,"Name":""} Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.167681 4826 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.167724 4826 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.174663 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.200965 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.202500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.211382 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:21 crc kubenswrapper[4826]: E0319 18:58:21.211486 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.212916 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.212984 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.244728 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.254503 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:21 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:21 crc kubenswrapper[4826]: [+]process-running ok Mar 19 18:58:21 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.254577 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.322206 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mwhjm\" (UID: 
\"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.399806 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.406624 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.408876 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.424689 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.536465 4826 generic.go:334] "Generic (PLEG): container finished" podID="9055ec15-01ab-4655-9059-b8f38b404dc8" containerID="e7a27a357c10038f0e46f08c1e754c53a000cfb62c8731792e2997b3f14bc498" exitCode=0 Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.536562 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9055ec15-01ab-4655-9059-b8f38b404dc8","Type":"ContainerDied","Data":"e7a27a357c10038f0e46f08c1e754c53a000cfb62c8731792e2997b3f14bc498"} Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.551665 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f","Type":"ContainerStarted","Data":"9e615b641486081c9c731c5c51a59d8579218b473a38fee6f68b4528c9d49384"} Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.567494 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" event={"ID":"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb","Type":"ContainerStarted","Data":"f5b0eabfd9def3b08d7db10405f4c76d730adca310f76e779a43f425c32eed08"} Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.567549 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" event={"ID":"eeb43c2f-961b-4ed4-9aa0-cda4dea289cb","Type":"ContainerStarted","Data":"7bec8c7d60713a410dc85a75079da94d41f5143092b393a7f440a9f192793aa5"} Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.574951 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" event={"ID":"033a3bd5-e9ce-484e-a282-5a01b1bcf85e","Type":"ContainerStarted","Data":"4c06534f3d163154350061e7b02f5b60133f21a0178480cdaccb5dc41b332cbd"} Mar 19 18:58:21 crc kubenswrapper[4826]: 
I0319 18:58:21.590853 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" podStartSLOduration=14.590833109 podStartE2EDuration="14.590833109s" podCreationTimestamp="2026-03-19 18:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:21.588250449 +0000 UTC m=+126.342318772" watchObservedRunningTime="2026-03-19 18:58:21.590833109 +0000 UTC m=+126.344901422" Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.598590 4826 generic.go:334] "Generic (PLEG): container finished" podID="7c53abe4-412d-47a0-bccc-ec9e6f4d8784" containerID="c9b0fe2e7f9ebfff2357c08b0c5c9bdbaa33ce2f44b1e85e2a7eb86f4a52b539" exitCode=0 Mar 19 18:58:21 crc kubenswrapper[4826]: I0319 18:58:21.600066 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" event={"ID":"7c53abe4-412d-47a0-bccc-ec9e6f4d8784","Type":"ContainerDied","Data":"c9b0fe2e7f9ebfff2357c08b0c5c9bdbaa33ce2f44b1e85e2a7eb86f4a52b539"} Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.011922 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.068506 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"] Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.244589 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 18:58:22 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 18:58:22 crc kubenswrapper[4826]: 
[+]process-running ok Mar 19 18:58:22 crc kubenswrapper[4826]: healthz check failed Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.244776 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.643329 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" event={"ID":"033a3bd5-e9ce-484e-a282-5a01b1bcf85e","Type":"ContainerStarted","Data":"8152ea476a2ea8194af94b4f363420c50aca349e58542c9217f181414248ff58"} Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.645318 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.661198 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.675423 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" event={"ID":"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e","Type":"ContainerStarted","Data":"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"} Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.675472 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" event={"ID":"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e","Type":"ContainerStarted","Data":"7063689c24369653482c0bd06c861d5cf00a046eaa975b22b6e2a91b623ef4e1"} Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.675773 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.699369 4826 generic.go:334] "Generic (PLEG): container finished" podID="35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" containerID="a0b5d84a04652fbc2cc187164dc9631128bcd7a12fbe33568bbfe59380105794" exitCode=0 Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.699715 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f","Type":"ContainerDied","Data":"a0b5d84a04652fbc2cc187164dc9631128bcd7a12fbe33568bbfe59380105794"} Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.701389 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" podStartSLOduration=8.701363226 podStartE2EDuration="8.701363226s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:22.671033181 +0000 UTC m=+127.425101494" watchObservedRunningTime="2026-03-19 18:58:22.701363226 +0000 UTC m=+127.455431539" Mar 19 18:58:22 crc kubenswrapper[4826]: I0319 18:58:22.728787 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" podStartSLOduration=65.728758552 podStartE2EDuration="1m5.728758552s" podCreationTimestamp="2026-03-19 18:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:22.724136455 +0000 UTC m=+127.478204768" watchObservedRunningTime="2026-03-19 18:58:22.728758552 +0000 UTC m=+127.482826865" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.201707 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.243230 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc8w2\" (UniqueName: \"kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2\") pod \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.243422 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume\") pod \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.243512 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume\") pod \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\" (UID: \"7c53abe4-412d-47a0-bccc-ec9e6f4d8784\") " Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.244356 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c53abe4-412d-47a0-bccc-ec9e6f4d8784" (UID: "7c53abe4-412d-47a0-bccc-ec9e6f4d8784"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.249107 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.263631 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2" (OuterVolumeSpecName: "kube-api-access-kc8w2") pod "7c53abe4-412d-47a0-bccc-ec9e6f4d8784" (UID: "7c53abe4-412d-47a0-bccc-ec9e6f4d8784"). InnerVolumeSpecName "kube-api-access-kc8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.263955 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c53abe4-412d-47a0-bccc-ec9e6f4d8784" (UID: "7c53abe4-412d-47a0-bccc-ec9e6f4d8784"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.273212 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.279866 4826 ???:1] "http: TLS handshake error from 192.168.126.11:52640: no serving certificate available for the kubelet" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.327114 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.347488 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.347570 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc8w2\" (UniqueName: \"kubernetes.io/projected/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-kube-api-access-kc8w2\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.347589 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c53abe4-412d-47a0-bccc-ec9e6f4d8784-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.448984 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9055ec15-01ab-4655-9059-b8f38b404dc8" (UID: "9055ec15-01ab-4655-9059-b8f38b404dc8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.449411 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir\") pod \"9055ec15-01ab-4655-9059-b8f38b404dc8\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.449482 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access\") pod \"9055ec15-01ab-4655-9059-b8f38b404dc8\" (UID: \"9055ec15-01ab-4655-9059-b8f38b404dc8\") " Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.450148 4826 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9055ec15-01ab-4655-9059-b8f38b404dc8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.459864 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9055ec15-01ab-4655-9059-b8f38b404dc8" (UID: "9055ec15-01ab-4655-9059-b8f38b404dc8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.552839 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9055ec15-01ab-4655-9059-b8f38b404dc8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.736280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" event={"ID":"7c53abe4-412d-47a0-bccc-ec9e6f4d8784","Type":"ContainerDied","Data":"4b0b2244dc2d1bfa26f6b6b2f0832b5da4f1ff6e82bc6b2786fa65bc080e6611"} Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.736787 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b0b2244dc2d1bfa26f6b6b2f0832b5da4f1ff6e82bc6b2786fa65bc080e6611" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.736691 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.740172 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9055ec15-01ab-4655-9059-b8f38b404dc8","Type":"ContainerDied","Data":"ee51fe8b4264e3fa94643db0ca4c36df757004d62f9b03d33018bf4209e48a53"} Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.740241 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee51fe8b4264e3fa94643db0ca4c36df757004d62f9b03d33018bf4209e48a53" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.740345 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 18:58:23 crc kubenswrapper[4826]: I0319 18:58:23.852558 4826 ???:1] "http: TLS handshake error from 192.168.126.11:52648: no serving certificate available for the kubelet" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.164333 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.273691 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir\") pod \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.273814 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access\") pod \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\" (UID: \"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f\") " Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.276841 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" (UID: "35fe00e4-6b92-49ef-8ad3-88f630b0bb7f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.292550 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" (UID: "35fe00e4-6b92-49ef-8ad3-88f630b0bb7f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.377669 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.377711 4826 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35fe00e4-6b92-49ef-8ad3-88f630b0bb7f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.781788 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.782226 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"35fe00e4-6b92-49ef-8ad3-88f630b0bb7f","Type":"ContainerDied","Data":"9e615b641486081c9c731c5c51a59d8579218b473a38fee6f68b4528c9d49384"} Mar 19 18:58:24 crc kubenswrapper[4826]: I0319 18:58:24.782273 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e615b641486081c9c731c5c51a59d8579218b473a38fee6f68b4528c9d49384" Mar 19 18:58:25 crc kubenswrapper[4826]: I0319 18:58:25.464881 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 18:58:26 crc kubenswrapper[4826]: I0319 18:58:26.463415 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ph9t5" Mar 19 18:58:28 crc kubenswrapper[4826]: I0319 18:58:28.975034 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.531682 4826 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.532185 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.532594 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.533004 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.890647 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:30 crc kubenswrapper[4826]: I0319 18:58:30.894338 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-p892r" Mar 19 18:58:31 crc kubenswrapper[4826]: E0319 18:58:31.171897 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:31 crc kubenswrapper[4826]: E0319 18:58:31.173528 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:31 crc kubenswrapper[4826]: E0319 18:58:31.178609 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:31 crc kubenswrapper[4826]: E0319 18:58:31.178704 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:33 crc kubenswrapper[4826]: I0319 18:58:33.562700 4826 ???:1] "http: TLS handshake error from 192.168.126.11:48398: no serving certificate available for the kubelet" Mar 19 18:58:33 crc kubenswrapper[4826]: I0319 18:58:33.862621 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:33 crc kubenswrapper[4826]: I0319 18:58:33.862872 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" 
containerID="cri-o://bdced0f6c149d0c9f1dfc745c1329d1b49fc233f2768905e8c9a312e0b6e6f60" gracePeriod=30 Mar 19 18:58:33 crc kubenswrapper[4826]: I0319 18:58:33.897729 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 18:58:33 crc kubenswrapper[4826]: I0319 18:58:33.903422 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" containerName="route-controller-manager" containerID="cri-o://8152ea476a2ea8194af94b4f363420c50aca349e58542c9217f181414248ff58" gracePeriod=30 Mar 19 18:58:34 crc kubenswrapper[4826]: I0319 18:58:34.981669 4826 generic.go:334] "Generic (PLEG): container finished" podID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerID="bdced0f6c149d0c9f1dfc745c1329d1b49fc233f2768905e8c9a312e0b6e6f60" exitCode=0 Mar 19 18:58:34 crc kubenswrapper[4826]: I0319 18:58:34.981702 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" event={"ID":"2be5868b-c0a8-45a6-8b09-5020ed53e863","Type":"ContainerDied","Data":"bdced0f6c149d0c9f1dfc745c1329d1b49fc233f2768905e8c9a312e0b6e6f60"} Mar 19 18:58:34 crc kubenswrapper[4826]: I0319 18:58:34.983757 4826 generic.go:334] "Generic (PLEG): container finished" podID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" containerID="8152ea476a2ea8194af94b4f363420c50aca349e58542c9217f181414248ff58" exitCode=0 Mar 19 18:58:34 crc kubenswrapper[4826]: I0319 18:58:34.983796 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" event={"ID":"033a3bd5-e9ce-484e-a282-5a01b1bcf85e","Type":"ContainerDied","Data":"8152ea476a2ea8194af94b4f363420c50aca349e58542c9217f181414248ff58"} Mar 19 18:58:37 crc kubenswrapper[4826]: I0319 18:58:37.991922 4826 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 18:58:39 crc kubenswrapper[4826]: I0319 18:58:39.004548 4826 patch_prober.go:28] interesting pod/controller-manager-574fb44d96-6zwvm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:58:39 crc kubenswrapper[4826]: I0319 18:58:39.005219 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:58:40 crc kubenswrapper[4826]: I0319 18:58:40.545958 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cbmtf" Mar 19 18:58:40 crc kubenswrapper[4826]: I0319 18:58:40.564775 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=3.5647558569999998 podStartE2EDuration="3.564755857s" podCreationTimestamp="2026-03-19 18:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:40.559835408 +0000 UTC m=+145.313903751" watchObservedRunningTime="2026-03-19 18:58:40.564755857 +0000 UTC m=+145.318824180" Mar 19 18:58:41 crc kubenswrapper[4826]: E0319 18:58:41.167739 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:41 crc kubenswrapper[4826]: E0319 18:58:41.169191 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:41 crc kubenswrapper[4826]: E0319 18:58:41.170702 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:41 crc kubenswrapper[4826]: E0319 18:58:41.170736 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:41 crc kubenswrapper[4826]: I0319 18:58:41.337818 4826 patch_prober.go:28] interesting pod/route-controller-manager-59f4d7d68d-p8tdq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:58:41 crc kubenswrapper[4826]: I0319 18:58:41.337921 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:58:41 crc kubenswrapper[4826]: I0319 18:58:41.416998 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" Mar 19 18:58:41 crc kubenswrapper[4826]: I0319 18:58:41.991690 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 18:58:46 crc kubenswrapper[4826]: I0319 18:58:46.015219 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=5.01517564 podStartE2EDuration="5.01517564s" podCreationTimestamp="2026-03-19 18:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:58:46.008332489 +0000 UTC m=+150.762400812" watchObservedRunningTime="2026-03-19 18:58:46.01517564 +0000 UTC m=+150.769243983" Mar 19 18:58:49 crc kubenswrapper[4826]: I0319 18:58:49.004623 4826 patch_prober.go:28] interesting pod/controller-manager-574fb44d96-6zwvm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:58:49 crc kubenswrapper[4826]: I0319 18:58:49.005118 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:58:50 crc kubenswrapper[4826]: I0319 18:58:50.092382 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-m7zht_97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a/kube-multus-additional-cni-plugins/0.log" Mar 19 18:58:50 crc kubenswrapper[4826]: I0319 18:58:50.092452 4826 generic.go:334] "Generic (PLEG): container finished" podID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" exitCode=137 Mar 19 18:58:50 crc kubenswrapper[4826]: I0319 18:58:50.092500 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" event={"ID":"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a","Type":"ContainerDied","Data":"357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b"} Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.053294 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" Mar 19 18:58:51 crc kubenswrapper[4826]: E0319 18:58:51.165108 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:51 crc kubenswrapper[4826]: E0319 18:58:51.166767 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:51 crc kubenswrapper[4826]: E0319 18:58:51.167582 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:58:51 crc kubenswrapper[4826]: E0319 18:58:51.167715 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.338173 4826 patch_prober.go:28] interesting pod/route-controller-manager-59f4d7d68d-p8tdq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.338323 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.888446 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.893408 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.987818 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles\") pod \"2be5868b-c0a8-45a6-8b09-5020ed53e863\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988352 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca\") pod \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988397 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert\") pod \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988448 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config\") pod \"2be5868b-c0a8-45a6-8b09-5020ed53e863\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988475 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp7qb\" (UniqueName: \"kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb\") pod 
\"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988539 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca\") pod \"2be5868b-c0a8-45a6-8b09-5020ed53e863\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988566 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w97r\" (UniqueName: \"kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r\") pod \"2be5868b-c0a8-45a6-8b09-5020ed53e863\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988640 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert\") pod \"2be5868b-c0a8-45a6-8b09-5020ed53e863\" (UID: \"2be5868b-c0a8-45a6-8b09-5020ed53e863\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988738 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config\") pod \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\" (UID: \"033a3bd5-e9ce-484e-a282-5a01b1bcf85e\") " Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.988860 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2be5868b-c0a8-45a6-8b09-5020ed53e863" (UID: "2be5868b-c0a8-45a6-8b09-5020ed53e863"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.989093 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.989903 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config" (OuterVolumeSpecName: "config") pod "033a3bd5-e9ce-484e-a282-5a01b1bcf85e" (UID: "033a3bd5-e9ce-484e-a282-5a01b1bcf85e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.991165 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca" (OuterVolumeSpecName: "client-ca") pod "2be5868b-c0a8-45a6-8b09-5020ed53e863" (UID: "2be5868b-c0a8-45a6-8b09-5020ed53e863"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.995759 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca" (OuterVolumeSpecName: "client-ca") pod "033a3bd5-e9ce-484e-a282-5a01b1bcf85e" (UID: "033a3bd5-e9ce-484e-a282-5a01b1bcf85e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.996802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config" (OuterVolumeSpecName: "config") pod "2be5868b-c0a8-45a6-8b09-5020ed53e863" (UID: "2be5868b-c0a8-45a6-8b09-5020ed53e863"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:58:51 crc kubenswrapper[4826]: I0319 18:58:51.999321 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r" (OuterVolumeSpecName: "kube-api-access-2w97r") pod "2be5868b-c0a8-45a6-8b09-5020ed53e863" (UID: "2be5868b-c0a8-45a6-8b09-5020ed53e863"). InnerVolumeSpecName "kube-api-access-2w97r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.001807 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2be5868b-c0a8-45a6-8b09-5020ed53e863" (UID: "2be5868b-c0a8-45a6-8b09-5020ed53e863"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.002905 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "033a3bd5-e9ce-484e-a282-5a01b1bcf85e" (UID: "033a3bd5-e9ce-484e-a282-5a01b1bcf85e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.021102 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb" (OuterVolumeSpecName: "kube-api-access-kp7qb") pod "033a3bd5-e9ce-484e-a282-5a01b1bcf85e" (UID: "033a3bd5-e9ce-484e-a282-5a01b1bcf85e"). InnerVolumeSpecName "kube-api-access-kp7qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043458 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:58:52 crc kubenswrapper[4826]: E0319 18:58:52.043708 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043727 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: E0319 18:58:52.043742 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9055ec15-01ab-4655-9059-b8f38b404dc8" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043749 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9055ec15-01ab-4655-9059-b8f38b404dc8" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: E0319 18:58:52.043764 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c53abe4-412d-47a0-bccc-ec9e6f4d8784" containerName="collect-profiles" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043771 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c53abe4-412d-47a0-bccc-ec9e6f4d8784" containerName="collect-profiles" Mar 19 18:58:52 crc kubenswrapper[4826]: E0319 18:58:52.043783 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043789 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: E0319 18:58:52.043796 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" 
containerName="route-controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.043801 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" containerName="route-controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044043 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" containerName="route-controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044064 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fe00e4-6b92-49ef-8ad3-88f630b0bb7f" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044071 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" containerName="controller-manager" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044080 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9055ec15-01ab-4655-9059-b8f38b404dc8" containerName="pruner" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044087 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c53abe4-412d-47a0-bccc-ec9e6f4d8784" containerName="collect-profiles" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044432 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.044533 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.091416 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.091514 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.091558 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4gm\" (UniqueName: \"kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.091809 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.091881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092037 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092055 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp7qb\" (UniqueName: \"kubernetes.io/projected/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-kube-api-access-kp7qb\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092066 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2be5868b-c0a8-45a6-8b09-5020ed53e863-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092077 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w97r\" (UniqueName: \"kubernetes.io/projected/2be5868b-c0a8-45a6-8b09-5020ed53e863-kube-api-access-2w97r\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092087 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2be5868b-c0a8-45a6-8b09-5020ed53e863-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092097 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092107 4826 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.092117 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/033a3bd5-e9ce-484e-a282-5a01b1bcf85e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.108996 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" event={"ID":"2be5868b-c0a8-45a6-8b09-5020ed53e863","Type":"ContainerDied","Data":"c99c5bf4e04fb0295d6e9006dc419627c97df31f5d4c3689c808bb0d2d429f3a"} Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.109065 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-574fb44d96-6zwvm" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.109097 4826 scope.go:117] "RemoveContainer" containerID="bdced0f6c149d0c9f1dfc745c1329d1b49fc233f2768905e8c9a312e0b6e6f60" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.110884 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" event={"ID":"033a3bd5-e9ce-484e-a282-5a01b1bcf85e","Type":"ContainerDied","Data":"4c06534f3d163154350061e7b02f5b60133f21a0178480cdaccb5dc41b332cbd"} Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.111038 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.145732 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.148996 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-574fb44d96-6zwvm"] Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.154318 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.157420 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59f4d7d68d-p8tdq"] Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.194352 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.194429 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4gm\" (UniqueName: \"kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.194481 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca\") pod 
\"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.194508 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.194567 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.195894 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.196782 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.196915 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.198501 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.211176 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4gm\" (UniqueName: \"kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm\") pod \"controller-manager-668558775b-vj5kd\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:52 crc kubenswrapper[4826]: I0319 18:58:52.369210 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.870611 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.901340 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"] Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.902334 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.904789 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.910224 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.910444 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.910721 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.910918 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.911541 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.917364 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"] Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.919077 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.921483 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.922493 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpq8\" (UniqueName: \"kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.922575 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.988892 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033a3bd5-e9ce-484e-a282-5a01b1bcf85e" path="/var/lib/kubelet/pods/033a3bd5-e9ce-484e-a282-5a01b1bcf85e/volumes" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.990248 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be5868b-c0a8-45a6-8b09-5020ed53e863" path="/var/lib/kubelet/pods/2be5868b-c0a8-45a6-8b09-5020ed53e863/volumes" Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 18:58:53.992188 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:58:53 crc kubenswrapper[4826]: I0319 
18:58:53.995275 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:53.999432 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.000091 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.012434 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.023607 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.023700 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpq8\" (UniqueName: \"kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.023731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc 
kubenswrapper[4826]: I0319 18:58:54.023754 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.023808 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.023858 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.025062 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.028052 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: 
\"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.028315 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.041722 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpq8\" (UniqueName: \"kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8\") pod \"route-controller-manager-7bc9785dc4-k6mdp\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") " pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.124551 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.124686 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.124737 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.141177 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.223386 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:58:54 crc kubenswrapper[4826]: I0319 18:58:54.317307 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:58:55 crc kubenswrapper[4826]: I0319 18:58:55.048770 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 18:58:55 crc kubenswrapper[4826]: E0319 18:58:55.680455 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 18:58:55 crc kubenswrapper[4826]: E0319 18:58:55.680946 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcngb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2fpvj_openshift-marketplace(0e0d7689-755d-4e24-a337-4177c37c2437): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:58:55 crc kubenswrapper[4826]: E0319 18:58:55.682319 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2fpvj" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" Mar 19 18:58:57 crc 
kubenswrapper[4826]: E0319 18:58:57.151054 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2fpvj" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.231695 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.231893 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b68bp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lwdqq_openshift-marketplace(6397cca1-7284-4e40-9b7e-3f8026c72f5f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.233587 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lwdqq" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" Mar 19 18:58:57 crc 
kubenswrapper[4826]: E0319 18:58:57.250819 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.250953 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xkc6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-vsrvh_openshift-marketplace(06fdacd5-0f40-4d55-8df2-67ea56f25595): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.252044 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vsrvh" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.262293 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.262531 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qghf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8rs7z_openshift-marketplace(dcf719a6-7a63-4efa-b8dd-1beba09934f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:58:57 crc kubenswrapper[4826]: E0319 18:58:57.263735 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8rs7z" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" Mar 19 18:59:01 crc 
kubenswrapper[4826]: E0319 18:59:01.000419 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8rs7z" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.000588 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lwdqq" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.000623 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vsrvh" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.124601 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.124763 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vcbbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkslk_openshift-marketplace(7109581b-42ad-4e72-89be-ae269dcaea42): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.125994 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zkslk" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" Mar 19 18:59:01 crc 
kubenswrapper[4826]: E0319 18:59:01.164551 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.164914 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.165225 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.165289 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.202700 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.202914 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zp8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-psc5t_openshift-marketplace(ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.204138 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-psc5t" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.249420 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.249578 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs6z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rrk5r_openshift-marketplace(f4293235-5c04-462c-bef4-8595d0c89ec6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 18:59:01 crc kubenswrapper[4826]: E0319 18:59:01.250802 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rrk5r" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" Mar 19 18:59:01 crc 
kubenswrapper[4826]: I0319 18:59:01.945882 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:01 crc kubenswrapper[4826]: I0319 18:59:01.946919 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.019336 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.097937 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.098033 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.098082 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.118402 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-zkslk" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.153568 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.153744 4826 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 18:59:02 crc kubenswrapper[4826]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 18:59:02 crc kubenswrapper[4826]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gdq6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565778-hxldh_openshift-infra(509cd3a8-f3bb-4214-a70b-e589905ad242): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 18:59:02 crc kubenswrapper[4826]: > logger="UnhandledError" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.154853 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565778-hxldh" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.157645 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-m7zht_97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a/kube-multus-additional-cni-plugins/0.log" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.157775 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.165388 4826 scope.go:117] "RemoveContainer" containerID="8152ea476a2ea8194af94b4f363420c50aca349e58542c9217f181414248ff58" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.180166 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-m7zht_97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a/kube-multus-additional-cni-plugins/0.log" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.180281 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" event={"ID":"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a","Type":"ContainerDied","Data":"ded79eb4312c739b1cf9f964f7b563bbc6467a4fea8d3743fcefa50e46b9d391"} Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.180354 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-m7zht" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.192242 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565778-hxldh" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.192274 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rrk5r" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" Mar 19 18:59:02 crc kubenswrapper[4826]: E0319 18:59:02.193156 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-psc5t" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.200022 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.200076 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc 
kubenswrapper[4826]: I0319 18:59:02.200111 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.200182 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.200248 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.227660 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access\") pod \"installer-9-crc\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.231018 4826 scope.go:117] "RemoveContainer" containerID="357c925170cbafc742e33faa59acfddfc8282dc11185639e6fdf58e5e556690b" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301180 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfdt\" (UniqueName: \"kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt\") pod \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") 
" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301262 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir\") pod \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301429 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist\") pod \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301458 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready\") pod \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\" (UID: \"97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a\") " Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301568 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" (UID: "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.301873 4826 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.302802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" (UID: "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.303421 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready" (OuterVolumeSpecName: "ready") pod "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" (UID: "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.309339 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt" (OuterVolumeSpecName: "kube-api-access-shfdt") pod "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" (UID: "97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a"). InnerVolumeSpecName "kube-api-access-shfdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.314918 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.404077 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfdt\" (UniqueName: \"kubernetes.io/projected/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-kube-api-access-shfdt\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.404128 4826 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.404141 4826 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a-ready\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.405172 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"] Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.541315 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-m7zht"] Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.546166 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-m7zht"] Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.555301 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 18:59:02 crc kubenswrapper[4826]: W0319 18:59:02.568014 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod76ee4930_51ba_4831_9d42_469ffb000e9d.slice/crio-05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4 WatchSource:0}: Error finding container 05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4: Status 404 returned 
error can't find the container with id 05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4 Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.658797 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 18:59:02 crc kubenswrapper[4826]: I0319 18:59:02.659576 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:59:02 crc kubenswrapper[4826]: W0319 18:59:02.660485 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4fddc778_f1d1_4508_ad83_e9b8aaf31e1c.slice/crio-cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0 WatchSource:0}: Error finding container cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0: Status 404 returned error can't find the container with id cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0 Mar 19 18:59:02 crc kubenswrapper[4826]: W0319 18:59:02.669571 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf127f891_91c7_43fa_bdf7_b58d406eedc5.slice/crio-53f45f0b4e452df815b441d9318bbc6b6434dfed98525e83a535ae30b682fc9b WatchSource:0}: Error finding container 53f45f0b4e452df815b441d9318bbc6b6434dfed98525e83a535ae30b682fc9b: Status 404 returned error can't find the container with id 53f45f0b4e452df815b441d9318bbc6b6434dfed98525e83a535ae30b682fc9b Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.193127 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" event={"ID":"82985cb0-104d-4760-a61a-7165a1f888a3","Type":"ContainerStarted","Data":"01b3c31578f744f3d6b6fe2cac7e05b5cc930b37cdc552d66c06a084a128c116"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.193181 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" event={"ID":"82985cb0-104d-4760-a61a-7165a1f888a3","Type":"ContainerStarted","Data":"e1b217f819097d8eb4d54d04735b03a1ddcb5ce3170f4afcd7d55a475c2f3eb2"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.193854 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.201584 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" event={"ID":"f127f891-91c7-43fa-bdf7-b58d406eedc5","Type":"ContainerStarted","Data":"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.201659 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" event={"ID":"f127f891-91c7-43fa-bdf7-b58d406eedc5","Type":"ContainerStarted","Data":"53f45f0b4e452df815b441d9318bbc6b6434dfed98525e83a535ae30b682fc9b"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.201907 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerName="controller-manager" containerID="cri-o://d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73" gracePeriod=30 Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.202594 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.214388 4826 generic.go:334] "Generic (PLEG): container finished" podID="007d8118-0079-4d3d-b764-01eadbd419c5" containerID="a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0" exitCode=0 Mar 19 18:59:03 
crc kubenswrapper[4826]: I0319 18:59:03.214485 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerDied","Data":"a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.218614 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76ee4930-51ba-4831-9d42-469ffb000e9d","Type":"ContainerStarted","Data":"5ef4d51ec32ff8ae1d9aaec3ae8413fab8773bb251a2b2746f2b25fafbf0dbc9"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.218638 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76ee4930-51ba-4831-9d42-469ffb000e9d","Type":"ContainerStarted","Data":"05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.220526 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c","Type":"ContainerStarted","Data":"89c21d436d7691d893287a335cf081f552ecec57e5e94c95b37e09095221aa31"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.220560 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c","Type":"ContainerStarted","Data":"cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0"} Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.232302 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" podStartSLOduration=10.232272408 podStartE2EDuration="10.232272408s" podCreationTimestamp="2026-03-19 18:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:03.231363224 +0000 UTC m=+167.985431547" watchObservedRunningTime="2026-03-19 18:59:03.232272408 +0000 UTC m=+167.986340731" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.246638 4826 patch_prober.go:28] interesting pod/controller-manager-668558775b-vj5kd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": EOF" start-of-body= Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.246729 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": EOF" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.253513 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" podStartSLOduration=30.25349116 podStartE2EDuration="30.25349116s" podCreationTimestamp="2026-03-19 18:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:03.251142077 +0000 UTC m=+168.005210410" watchObservedRunningTime="2026-03-19 18:59:03.25349116 +0000 UTC m=+168.007559473" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.289237 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.289207834 podStartE2EDuration="10.289207834s" podCreationTimestamp="2026-03-19 18:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:03.282065045 +0000 UTC m=+168.036133368" watchObservedRunningTime="2026-03-19 
18:59:03.289207834 +0000 UTC m=+168.043276147" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.339019 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.338982211 podStartE2EDuration="2.338982211s" podCreationTimestamp="2026-03-19 18:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:03.3238067 +0000 UTC m=+168.077875023" watchObservedRunningTime="2026-03-19 18:59:03.338982211 +0000 UTC m=+168.093050524" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.405214 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.543562 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.633574 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles\") pod \"f127f891-91c7-43fa-bdf7-b58d406eedc5\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.633739 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert\") pod \"f127f891-91c7-43fa-bdf7-b58d406eedc5\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.633776 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk4gm\" (UniqueName: 
\"kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm\") pod \"f127f891-91c7-43fa-bdf7-b58d406eedc5\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.633839 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca\") pod \"f127f891-91c7-43fa-bdf7-b58d406eedc5\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.633862 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config\") pod \"f127f891-91c7-43fa-bdf7-b58d406eedc5\" (UID: \"f127f891-91c7-43fa-bdf7-b58d406eedc5\") " Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.634586 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f127f891-91c7-43fa-bdf7-b58d406eedc5" (UID: "f127f891-91c7-43fa-bdf7-b58d406eedc5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.634629 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca" (OuterVolumeSpecName: "client-ca") pod "f127f891-91c7-43fa-bdf7-b58d406eedc5" (UID: "f127f891-91c7-43fa-bdf7-b58d406eedc5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.634819 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config" (OuterVolumeSpecName: "config") pod "f127f891-91c7-43fa-bdf7-b58d406eedc5" (UID: "f127f891-91c7-43fa-bdf7-b58d406eedc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.644815 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f127f891-91c7-43fa-bdf7-b58d406eedc5" (UID: "f127f891-91c7-43fa-bdf7-b58d406eedc5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.658827 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm" (OuterVolumeSpecName: "kube-api-access-wk4gm") pod "f127f891-91c7-43fa-bdf7-b58d406eedc5" (UID: "f127f891-91c7-43fa-bdf7-b58d406eedc5"). InnerVolumeSpecName "kube-api-access-wk4gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.735630 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.735714 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f127f891-91c7-43fa-bdf7-b58d406eedc5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.735727 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk4gm\" (UniqueName: \"kubernetes.io/projected/f127f891-91c7-43fa-bdf7-b58d406eedc5-kube-api-access-wk4gm\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.735743 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.735755 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f127f891-91c7-43fa-bdf7-b58d406eedc5-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:03 crc kubenswrapper[4826]: I0319 18:59:03.989535 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" path="/var/lib/kubelet/pods/97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a/volumes" Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.231234 4826 generic.go:334] "Generic (PLEG): container finished" podID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerID="d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73" exitCode=0 Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.231357 4826 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.231363 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" event={"ID":"f127f891-91c7-43fa-bdf7-b58d406eedc5","Type":"ContainerDied","Data":"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73"} Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.231904 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-668558775b-vj5kd" event={"ID":"f127f891-91c7-43fa-bdf7-b58d406eedc5","Type":"ContainerDied","Data":"53f45f0b4e452df815b441d9318bbc6b6434dfed98525e83a535ae30b682fc9b"} Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.231945 4826 scope.go:117] "RemoveContainer" containerID="d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73" Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.236422 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerStarted","Data":"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3"} Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.241952 4826 generic.go:334] "Generic (PLEG): container finished" podID="4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" containerID="89c21d436d7691d893287a335cf081f552ecec57e5e94c95b37e09095221aa31" exitCode=0 Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.242619 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c","Type":"ContainerDied","Data":"89c21d436d7691d893287a335cf081f552ecec57e5e94c95b37e09095221aa31"} Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.259388 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.262345 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-668558775b-vj5kd"] Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.262740 4826 scope.go:117] "RemoveContainer" containerID="d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73" Mar 19 18:59:04 crc kubenswrapper[4826]: E0319 18:59:04.263153 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73\": container with ID starting with d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73 not found: ID does not exist" containerID="d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73" Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.263206 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73"} err="failed to get container status \"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73\": rpc error: code = NotFound desc = could not find container \"d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73\": container with ID starting with d52e515a6c04c545013e5e089bf30ad6ccf4f34e7f779fc3b898350043844a73 not found: ID does not exist" Mar 19 18:59:04 crc kubenswrapper[4826]: I0319 18:59:04.289023 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mprm6" podStartSLOduration=5.705496436 podStartE2EDuration="50.288996527s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="2026-03-19 18:58:19.082351659 +0000 UTC m=+123.836419972" lastFinishedPulling="2026-03-19 18:59:03.66585175 +0000 UTC m=+168.419920063" 
observedRunningTime="2026-03-19 18:59:04.285630519 +0000 UTC m=+169.039698892" watchObservedRunningTime="2026-03-19 18:59:04.288996527 +0000 UTC m=+169.043064880" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043062 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"] Mar 19 18:59:05 crc kubenswrapper[4826]: E0319 18:59:05.043349 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerName="controller-manager" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043369 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerName="controller-manager" Mar 19 18:59:05 crc kubenswrapper[4826]: E0319 18:59:05.043396 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043404 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043511 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b226df-3fc4-42ff-a5ec-37b9bbf1cd1a" containerName="kube-multus-additional-cni-plugins" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043524 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" containerName="controller-manager" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.043994 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.048357 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.048987 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.049334 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.049780 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.049840 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.050188 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.060691 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.063354 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"] Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.170529 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p47cs\" (UniqueName: \"kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " 
pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.170599 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.170645 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.170705 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.170746 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.272173 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.272267 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p47cs\" (UniqueName: \"kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.272292 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.272321 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.272350 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.273432 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.276145 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.276696 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.282761 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.289203 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p47cs\" (UniqueName: \"kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs\") pod \"controller-manager-76ff657d46-cchgd\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") " pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 
18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.384057 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.498920 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.540830 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.540896 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.622454 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"] Mar 19 18:59:05 crc kubenswrapper[4826]: W0319 18:59:05.632534 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb68a190_8139_4eb8_bdd5_d8dcfdb82169.slice/crio-6089af8eaadb0662cc7b2c6a85be35566c22db7b93f3e3e990cd480bc4536a49 WatchSource:0}: Error finding container 6089af8eaadb0662cc7b2c6a85be35566c22db7b93f3e3e990cd480bc4536a49: Status 404 returned error can't find the container with id 6089af8eaadb0662cc7b2c6a85be35566c22db7b93f3e3e990cd480bc4536a49 Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.693021 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access\") pod \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.693154 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir\") pod \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\" (UID: \"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c\") " Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.693241 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" (UID: "4fddc778-f1d1-4508-ad83-e9b8aaf31e1c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.693514 4826 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.699199 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" (UID: "4fddc778-f1d1-4508-ad83-e9b8aaf31e1c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.795297 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4fddc778-f1d1-4508-ad83-e9b8aaf31e1c-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:05 crc kubenswrapper[4826]: I0319 18:59:05.985467 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f127f891-91c7-43fa-bdf7-b58d406eedc5" path="/var/lib/kubelet/pods/f127f891-91c7-43fa-bdf7-b58d406eedc5/volumes"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.258786 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" event={"ID":"fb68a190-8139-4eb8-bdd5-d8dcfdb82169","Type":"ContainerStarted","Data":"b4f673ab361734fbd1fbb73179feea59adeaae81b4bd487affaf97185e3ba57c"}
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.258891 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.258912 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" event={"ID":"fb68a190-8139-4eb8-bdd5-d8dcfdb82169","Type":"ContainerStarted","Data":"6089af8eaadb0662cc7b2c6a85be35566c22db7b93f3e3e990cd480bc4536a49"}
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.261021 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4fddc778-f1d1-4508-ad83-e9b8aaf31e1c","Type":"ContainerDied","Data":"cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0"}
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.261065 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf0e05d4d66fad8f9a0e279266348b8a5aa8db9d2c16b5a96d45b233190fd0f0"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.261068 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.262832 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.277493 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" podStartSLOduration=13.277442811 podStartE2EDuration="13.277442811s" podCreationTimestamp="2026-03-19 18:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:06.275982662 +0000 UTC m=+171.030050975" watchObservedRunningTime="2026-03-19 18:59:06.277442811 +0000 UTC m=+171.031511144"
Mar 19 18:59:06 crc kubenswrapper[4826]: I0319 18:59:06.681464 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mprm6" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="registry-server" probeResult="failure" output=<
Mar 19 18:59:06 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 18:59:06 crc kubenswrapper[4826]: >
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.317782 4826 generic.go:334] "Generic (PLEG): container finished" podID="0e0d7689-755d-4e24-a337-4177c37c2437" containerID="5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b" exitCode=0
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.317867 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerDied","Data":"5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b"}
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.884771 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"]
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.885177 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerName="controller-manager" containerID="cri-o://b4f673ab361734fbd1fbb73179feea59adeaae81b4bd487affaf97185e3ba57c" gracePeriod=30
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.893548 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"]
Mar 19 18:59:13 crc kubenswrapper[4826]: I0319 18:59:13.893802 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" containerName="route-controller-manager" containerID="cri-o://01b3c31578f744f3d6b6fe2cac7e05b5cc930b37cdc552d66c06a084a128c116" gracePeriod=30
Mar 19 18:59:14 crc kubenswrapper[4826]: I0319 18:59:14.224881 4826 patch_prober.go:28] interesting pod/route-controller-manager-7bc9785dc4-k6mdp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body=
Mar 19 18:59:14 crc kubenswrapper[4826]: I0319 18:59:14.225595 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused"
Mar 19 18:59:14 crc kubenswrapper[4826]: I0319 18:59:14.328153 4826 generic.go:334] "Generic (PLEG): container finished" podID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerID="b4f673ab361734fbd1fbb73179feea59adeaae81b4bd487affaf97185e3ba57c" exitCode=0
Mar 19 18:59:14 crc kubenswrapper[4826]: I0319 18:59:14.328830 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" event={"ID":"fb68a190-8139-4eb8-bdd5-d8dcfdb82169","Type":"ContainerDied","Data":"b4f673ab361734fbd1fbb73179feea59adeaae81b4bd487affaf97185e3ba57c"}
Mar 19 18:59:14 crc kubenswrapper[4826]: I0319 18:59:14.549089 4826 ???:1] "http: TLS handshake error from 192.168.126.11:56836: no serving certificate available for the kubelet"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.346418 4826 generic.go:334] "Generic (PLEG): container finished" podID="82985cb0-104d-4760-a61a-7165a1f888a3" containerID="01b3c31578f744f3d6b6fe2cac7e05b5cc930b37cdc552d66c06a084a128c116" exitCode=0
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.346474 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" event={"ID":"82985cb0-104d-4760-a61a-7165a1f888a3","Type":"ContainerDied","Data":"01b3c31578f744f3d6b6fe2cac7e05b5cc930b37cdc552d66c06a084a128c116"}
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.397366 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.432070 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"]
Mar 19 18:59:15 crc kubenswrapper[4826]: E0319 18:59:15.433329 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" containerName="route-controller-manager"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.433378 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" containerName="route-controller-manager"
Mar 19 18:59:15 crc kubenswrapper[4826]: E0319 18:59:15.433426 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" containerName="pruner"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.433436 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" containerName="pruner"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.433569 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fddc778-f1d1-4508-ad83-e9b8aaf31e1c" containerName="pruner"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.433584 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" containerName="route-controller-manager"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.434458 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.448591 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"]
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.512815 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.545465 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca\") pod \"82985cb0-104d-4760-a61a-7165a1f888a3\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.545583 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpq8\" (UniqueName: \"kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8\") pod \"82985cb0-104d-4760-a61a-7165a1f888a3\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.545629 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config\") pod \"82985cb0-104d-4760-a61a-7165a1f888a3\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549283 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p47cs\" (UniqueName: \"kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs\") pod \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549366 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca\") pod \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549402 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert\") pod \"82985cb0-104d-4760-a61a-7165a1f888a3\" (UID: \"82985cb0-104d-4760-a61a-7165a1f888a3\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config\") pod \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549629 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjhq\" (UniqueName: \"kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549754 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549802 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.549833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.555140 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "82985cb0-104d-4760-a61a-7165a1f888a3" (UID: "82985cb0-104d-4760-a61a-7165a1f888a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.557409 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config" (OuterVolumeSpecName: "config") pod "fb68a190-8139-4eb8-bdd5-d8dcfdb82169" (UID: "fb68a190-8139-4eb8-bdd5-d8dcfdb82169"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.558780 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config" (OuterVolumeSpecName: "config") pod "82985cb0-104d-4760-a61a-7165a1f888a3" (UID: "82985cb0-104d-4760-a61a-7165a1f888a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.560129 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb68a190-8139-4eb8-bdd5-d8dcfdb82169" (UID: "fb68a190-8139-4eb8-bdd5-d8dcfdb82169"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.560583 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "82985cb0-104d-4760-a61a-7165a1f888a3" (UID: "82985cb0-104d-4760-a61a-7165a1f888a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.560783 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8" (OuterVolumeSpecName: "kube-api-access-8fpq8") pod "82985cb0-104d-4760-a61a-7165a1f888a3" (UID: "82985cb0-104d-4760-a61a-7165a1f888a3"). InnerVolumeSpecName "kube-api-access-8fpq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.563052 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs" (OuterVolumeSpecName: "kube-api-access-p47cs") pod "fb68a190-8139-4eb8-bdd5-d8dcfdb82169" (UID: "fb68a190-8139-4eb8-bdd5-d8dcfdb82169"). InnerVolumeSpecName "kube-api-access-p47cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.617642 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mprm6"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.650713 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles\") pod \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.650773 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert\") pod \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\" (UID: \"fb68a190-8139-4eb8-bdd5-d8dcfdb82169\") "
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.650949 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjhq\" (UniqueName: \"kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651010 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651041 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651062 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651137 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpq8\" (UniqueName: \"kubernetes.io/projected/82985cb0-104d-4760-a61a-7165a1f888a3-kube-api-access-8fpq8\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651155 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-config\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651167 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p47cs\" (UniqueName: \"kubernetes.io/projected/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-kube-api-access-p47cs\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651177 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651187 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82985cb0-104d-4760-a61a-7165a1f888a3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651198 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-config\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651206 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82985cb0-104d-4760-a61a-7165a1f888a3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.651204 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fb68a190-8139-4eb8-bdd5-d8dcfdb82169" (UID: "fb68a190-8139-4eb8-bdd5-d8dcfdb82169"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.652389 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.652928 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.656119 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.656915 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb68a190-8139-4eb8-bdd5-d8dcfdb82169" (UID: "fb68a190-8139-4eb8-bdd5-d8dcfdb82169"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.666841 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mprm6"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.670054 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjhq\" (UniqueName: \"kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq\") pod \"route-controller-manager-67d7bffc66-k8x2s\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.752852 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.753937 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:15 crc kubenswrapper[4826]: I0319 18:59:15.754003 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb68a190-8139-4eb8-bdd5-d8dcfdb82169-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.022379 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"]
Mar 19 18:59:16 crc kubenswrapper[4826]: W0319 18:59:16.023502 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7fcb636_1429_47d8_8bef_ad1131064a3d.slice/crio-aad4ebea09c62420944e57a3f7ef31cb4aabb53ae8e3cabec075854a0893d15b WatchSource:0}: Error finding container aad4ebea09c62420944e57a3f7ef31cb4aabb53ae8e3cabec075854a0893d15b: Status 404 returned error can't find the container with id aad4ebea09c62420944e57a3f7ef31cb4aabb53ae8e3cabec075854a0893d15b
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.354751 4826 generic.go:334] "Generic (PLEG): container finished" podID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerID="d5016f0f73675f0f20a63c82ea7680c6b52a107e28bc5551253a53795e4f16a4" exitCode=0
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.354862 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerDied","Data":"d5016f0f73675f0f20a63c82ea7680c6b52a107e28bc5551253a53795e4f16a4"}
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.364430 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd"
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.364461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" event={"ID":"fb68a190-8139-4eb8-bdd5-d8dcfdb82169","Type":"ContainerDied","Data":"6089af8eaadb0662cc7b2c6a85be35566c22db7b93f3e3e990cd480bc4536a49"}
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.365110 4826 scope.go:117] "RemoveContainer" containerID="b4f673ab361734fbd1fbb73179feea59adeaae81b4bd487affaf97185e3ba57c"
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.375696 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp" event={"ID":"82985cb0-104d-4760-a61a-7165a1f888a3","Type":"ContainerDied","Data":"e1b217f819097d8eb4d54d04735b03a1ddcb5ce3170f4afcd7d55a475c2f3eb2"}
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.375848 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.377507 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" event={"ID":"a7fcb636-1429-47d8-8bef-ad1131064a3d","Type":"ContainerStarted","Data":"aad4ebea09c62420944e57a3f7ef31cb4aabb53ae8e3cabec075854a0893d15b"}
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.385624 4826 patch_prober.go:28] interesting pod/controller-manager-76ff657d46-cchgd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.385716 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff657d46-cchgd" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.401071 4826 scope.go:117] "RemoveContainer" containerID="01b3c31578f744f3d6b6fe2cac7e05b5cc930b37cdc552d66c06a084a128c116"
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.401711 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"]
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.405076 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76ff657d46-cchgd"]
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.418698 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"]
Mar 19 18:59:16 crc kubenswrapper[4826]: I0319 18:59:16.424756 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc9785dc4-k6mdp"]
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.384953 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerStarted","Data":"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.387141 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerStarted","Data":"172c01382de3818f295008f8b636baa62dcbc571906ef7f9f2762319ed12e2e3"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.390750 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerStarted","Data":"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.394016 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerStarted","Data":"2fb7b21ab589ac0efc884c09f33932a8e96fcc9b586b78ee8a7d4ccb502be1db"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.396152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-hxldh" event={"ID":"509cd3a8-f3bb-4214-a70b-e589905ad242","Type":"ContainerStarted","Data":"0bec69c3b0d5bc0917d38cbb822b84fec430b28433b779379849e2b62c194246"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.398241 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerStarted","Data":"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.400119 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerStarted","Data":"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.401293 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vsrvh"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.401317 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vsrvh"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.402939 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" event={"ID":"a7fcb636-1429-47d8-8bef-ad1131064a3d","Type":"ContainerStarted","Data":"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9"}
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.404784 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.409777 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.416547 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fpvj" podStartSLOduration=3.652139735 podStartE2EDuration="1m1.416530759s" podCreationTimestamp="2026-03-19 18:58:16 +0000 UTC" firstStartedPulling="2026-03-19 18:58:19.082036871 +0000 UTC m=+123.836105184" lastFinishedPulling="2026-03-19 18:59:16.846427895 +0000 UTC m=+181.600496208" observedRunningTime="2026-03-19 18:59:17.415271665 +0000 UTC m=+182.169339978" watchObservedRunningTime="2026-03-19 18:59:17.416530759 +0000 UTC m=+182.170599062"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.434481 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565778-hxldh" podStartSLOduration=12.812040597 podStartE2EDuration="1m17.434464363s" podCreationTimestamp="2026-03-19 18:58:00 +0000 UTC" firstStartedPulling="2026-03-19 18:58:12.058982192 +0000 UTC m=+116.813050505" lastFinishedPulling="2026-03-19 18:59:16.681405958 +0000 UTC m=+181.435474271" observedRunningTime="2026-03-19 18:59:17.432678796 +0000 UTC m=+182.186747109" watchObservedRunningTime="2026-03-19 18:59:17.434464363 +0000 UTC m=+182.188532676"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.544134 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" podStartSLOduration=4.543405406 podStartE2EDuration="4.543405406s" podCreationTimestamp="2026-03-19 18:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:17.506469638 +0000 UTC m=+182.260537941" watchObservedRunningTime="2026-03-19 18:59:17.543405406 +0000 UTC m=+182.297473719"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.576842 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vsrvh" podStartSLOduration=4.914732143 podStartE2EDuration="1m1.5768189s" podCreationTimestamp="2026-03-19 18:58:16 +0000 UTC" firstStartedPulling="2026-03-19 18:58:20.516088635 +0000 UTC m=+125.270156938" lastFinishedPulling="2026-03-19 18:59:17.178175382 +0000 UTC m=+181.932243695" observedRunningTime="2026-03-19 18:59:17.549689182 +0000 UTC m=+182.303757495" watchObservedRunningTime="2026-03-19 18:59:17.5768189 +0000 UTC m=+182.330887213"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.834155 4826 csr.go:261] certificate signing request csr-fn2bf is approved, waiting to be issued
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.841800 4826 csr.go:257] certificate signing request csr-fn2bf is issued
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.984388 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82985cb0-104d-4760-a61a-7165a1f888a3" path="/var/lib/kubelet/pods/82985cb0-104d-4760-a61a-7165a1f888a3/volumes"
Mar 19 18:59:17 crc kubenswrapper[4826]: I0319 18:59:17.985005 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" path="/var/lib/kubelet/pods/fb68a190-8139-4eb8-bdd5-d8dcfdb82169/volumes"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.028535 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mprm6"]
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.028811 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mprm6" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="registry-server" containerID="cri-o://dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3" gracePeriod=2
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.048296 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"]
Mar 19 18:59:18 crc kubenswrapper[4826]: E0319 18:59:18.048559 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerName="controller-manager"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.048575 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerName="controller-manager"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.048952 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb68a190-8139-4eb8-bdd5-d8dcfdb82169" containerName="controller-manager"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.049411 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.053804 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.054021 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.054033 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.054135 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.055384 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.055530 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.076887 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.081177 4826 kubelet.go:2428] "SyncLoop UPDATE"
source="api" pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"] Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.088341 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4m2\" (UniqueName: \"kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.088398 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.088469 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.088504 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.088538 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.189500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4m2\" (UniqueName: \"kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.189576 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.189641 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.189689 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 
18:59:18.189714 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.191199 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.191610 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.191937 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.201681 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 
18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.205962 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4m2\" (UniqueName: \"kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2\") pod \"controller-manager-c6d575554-k7thv\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.378182 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.399101 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.424839 4826 generic.go:334] "Generic (PLEG): container finished" podID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerID="42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.424907 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerDied","Data":"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.433608 4826 generic.go:334] "Generic (PLEG): container finished" podID="7109581b-42ad-4e72-89be-ae269dcaea42" containerID="c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.433709 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerDied","Data":"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2"} Mar 19 18:59:18 crc 
kubenswrapper[4826]: I0319 18:59:18.440407 4826 generic.go:334] "Generic (PLEG): container finished" podID="509cd3a8-f3bb-4214-a70b-e589905ad242" containerID="0bec69c3b0d5bc0917d38cbb822b84fec430b28433b779379849e2b62c194246" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.440477 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-hxldh" event={"ID":"509cd3a8-f3bb-4214-a70b-e589905ad242","Type":"ContainerDied","Data":"0bec69c3b0d5bc0917d38cbb822b84fec430b28433b779379849e2b62c194246"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.451230 4826 generic.go:334] "Generic (PLEG): container finished" podID="007d8118-0079-4d3d-b764-01eadbd419c5" containerID="dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.451385 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mprm6" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.451687 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerDied","Data":"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.451724 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mprm6" event={"ID":"007d8118-0079-4d3d-b764-01eadbd419c5","Type":"ContainerDied","Data":"830bfaecf6788968409c8dbaf53bcc31ac98b36d22acc6987e1535c0d27dcde2"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.451744 4826 scope.go:117] "RemoveContainer" containerID="dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.454855 4826 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-vsrvh" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="registry-server" probeResult="failure" output=< Mar 19 18:59:18 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 18:59:18 crc kubenswrapper[4826]: > Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.474284 4826 generic.go:334] "Generic (PLEG): container finished" podID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerID="172c01382de3818f295008f8b636baa62dcbc571906ef7f9f2762319ed12e2e3" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.474395 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerDied","Data":"172c01382de3818f295008f8b636baa62dcbc571906ef7f9f2762319ed12e2e3"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.477320 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerStarted","Data":"111c51982c185b20f4c4658558b709bc304251905bf99c326ce02c437ef79df4"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.480374 4826 generic.go:334] "Generic (PLEG): container finished" podID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerID="a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58" exitCode=0 Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.480810 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerDied","Data":"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58"} Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.507492 4826 scope.go:117] "RemoveContainer" containerID="a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0" Mar 19 18:59:18 crc kubenswrapper[4826]: 
I0319 18:59:18.553074 4826 scope.go:117] "RemoveContainer" containerID="935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.602741 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content\") pod \"007d8118-0079-4d3d-b764-01eadbd419c5\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.603306 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities\") pod \"007d8118-0079-4d3d-b764-01eadbd419c5\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.603442 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzm69\" (UniqueName: \"kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69\") pod \"007d8118-0079-4d3d-b764-01eadbd419c5\" (UID: \"007d8118-0079-4d3d-b764-01eadbd419c5\") " Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.604836 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities" (OuterVolumeSpecName: "utilities") pod "007d8118-0079-4d3d-b764-01eadbd419c5" (UID: "007d8118-0079-4d3d-b764-01eadbd419c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.609518 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69" (OuterVolumeSpecName: "kube-api-access-qzm69") pod "007d8118-0079-4d3d-b764-01eadbd419c5" (UID: "007d8118-0079-4d3d-b764-01eadbd419c5"). 
InnerVolumeSpecName "kube-api-access-qzm69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.635962 4826 scope.go:117] "RemoveContainer" containerID="dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3" Mar 19 18:59:18 crc kubenswrapper[4826]: E0319 18:59:18.636607 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3\": container with ID starting with dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3 not found: ID does not exist" containerID="dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.636857 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3"} err="failed to get container status \"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3\": rpc error: code = NotFound desc = could not find container \"dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3\": container with ID starting with dc231ee940cf53856464bb73ce19aa82ccbd10d82c550f08fdc7fb3f0df5f7c3 not found: ID does not exist" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.636910 4826 scope.go:117] "RemoveContainer" containerID="a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0" Mar 19 18:59:18 crc kubenswrapper[4826]: E0319 18:59:18.642307 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0\": container with ID starting with a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0 not found: ID does not exist" containerID="a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0" Mar 19 18:59:18 
crc kubenswrapper[4826]: I0319 18:59:18.642355 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0"} err="failed to get container status \"a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0\": rpc error: code = NotFound desc = could not find container \"a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0\": container with ID starting with a8589876c52cd6dae86f925424686ad31894136813d605a0bce98396b843ffa0 not found: ID does not exist" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.642392 4826 scope.go:117] "RemoveContainer" containerID="935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d" Mar 19 18:59:18 crc kubenswrapper[4826]: E0319 18:59:18.642758 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d\": container with ID starting with 935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d not found: ID does not exist" containerID="935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.642807 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d"} err="failed to get container status \"935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d\": rpc error: code = NotFound desc = could not find container \"935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d\": container with ID starting with 935933d38157cc5c5f1feee2866013bfb5884eb978a1257eb5329d92b6ce0a9d not found: ID does not exist" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.658171 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "007d8118-0079-4d3d-b764-01eadbd419c5" (UID: "007d8118-0079-4d3d-b764-01eadbd419c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.704949 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzm69\" (UniqueName: \"kubernetes.io/projected/007d8118-0079-4d3d-b764-01eadbd419c5-kube-api-access-qzm69\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.705005 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.705025 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/007d8118-0079-4d3d-b764-01eadbd419c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.786106 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mprm6"] Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.795110 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mprm6"] Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.843161 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 10:34:49.795094289 +0000 UTC Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.843199 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7167h35m30.951898162s for next certificate rotation Mar 19 18:59:18 crc kubenswrapper[4826]: I0319 18:59:18.908324 4826 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"] Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.486522 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerID="111c51982c185b20f4c4658558b709bc304251905bf99c326ce02c437ef79df4" exitCode=0 Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.486618 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerDied","Data":"111c51982c185b20f4c4658558b709bc304251905bf99c326ce02c437ef79df4"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.490098 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerStarted","Data":"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.491914 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerStarted","Data":"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.493922 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerStarted","Data":"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.495318 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" event={"ID":"04db62ac-86c8-43d1-8d31-c3882c07d4fc","Type":"ContainerStarted","Data":"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc"} Mar 19 18:59:19 crc 
kubenswrapper[4826]: I0319 18:59:19.495343 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" event={"ID":"04db62ac-86c8-43d1-8d31-c3882c07d4fc","Type":"ContainerStarted","Data":"15aad52bc90c27e4c1efac88d0149791c9c71f3a40e4abd094ebc55ad95c88ce"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.497185 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.499048 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerStarted","Data":"454b0212ff1afe3db8244ac37e01a01b89ad0068661caa0f50cc7f6d0e578733"} Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.527409 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.576220 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-psc5t" podStartSLOduration=4.420462629 podStartE2EDuration="1m5.576199112s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="2026-03-19 18:58:17.8010945 +0000 UTC m=+122.555162803" lastFinishedPulling="2026-03-19 18:59:18.956830963 +0000 UTC m=+183.710899286" observedRunningTime="2026-03-19 18:59:19.54780987 +0000 UTC m=+184.301878193" watchObservedRunningTime="2026-03-19 18:59:19.576199112 +0000 UTC m=+184.330267425" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.579179 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkslk" podStartSLOduration=4.03925591 podStartE2EDuration="1m2.57917053s" podCreationTimestamp="2026-03-19 18:58:17 +0000 UTC" 
firstStartedPulling="2026-03-19 18:58:20.492149754 +0000 UTC m=+125.246218067" lastFinishedPulling="2026-03-19 18:59:19.032064374 +0000 UTC m=+183.786132687" observedRunningTime="2026-03-19 18:59:19.576394447 +0000 UTC m=+184.330462760" watchObservedRunningTime="2026-03-19 18:59:19.57917053 +0000 UTC m=+184.333238843" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.609129 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lwdqq" podStartSLOduration=5.449268347 podStartE2EDuration="1m5.609111052s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="2026-03-19 18:58:19.082117983 +0000 UTC m=+123.836186296" lastFinishedPulling="2026-03-19 18:59:19.241960688 +0000 UTC m=+183.996029001" observedRunningTime="2026-03-19 18:59:19.605444456 +0000 UTC m=+184.359512769" watchObservedRunningTime="2026-03-19 18:59:19.609111052 +0000 UTC m=+184.363179365" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.665729 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rs7z" podStartSLOduration=5.664204225 podStartE2EDuration="1m5.66570994s" podCreationTimestamp="2026-03-19 18:58:14 +0000 UTC" firstStartedPulling="2026-03-19 18:58:19.081484396 +0000 UTC m=+123.835552709" lastFinishedPulling="2026-03-19 18:59:19.082990091 +0000 UTC m=+183.837058424" observedRunningTime="2026-03-19 18:59:19.637233316 +0000 UTC m=+184.391301629" watchObservedRunningTime="2026-03-19 18:59:19.66570994 +0000 UTC m=+184.419778253" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.666526 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" podStartSLOduration=6.666521461 podStartE2EDuration="6.666521461s" podCreationTimestamp="2026-03-19 18:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:19.66307826 +0000 UTC m=+184.417146573" watchObservedRunningTime="2026-03-19 18:59:19.666521461 +0000 UTC m=+184.420589774" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.815179 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.844187 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-14 02:35:46.78296317 +0000 UTC Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.844258 4826 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7207h36m26.938707906s for next certificate rotation Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.922307 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdq6z\" (UniqueName: \"kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z\") pod \"509cd3a8-f3bb-4214-a70b-e589905ad242\" (UID: \"509cd3a8-f3bb-4214-a70b-e589905ad242\") " Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.940252 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z" (OuterVolumeSpecName: "kube-api-access-gdq6z") pod "509cd3a8-f3bb-4214-a70b-e589905ad242" (UID: "509cd3a8-f3bb-4214-a70b-e589905ad242"). InnerVolumeSpecName "kube-api-access-gdq6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:19 crc kubenswrapper[4826]: I0319 18:59:19.983624 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" path="/var/lib/kubelet/pods/007d8118-0079-4d3d-b764-01eadbd419c5/volumes" Mar 19 18:59:20 crc kubenswrapper[4826]: I0319 18:59:20.027885 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdq6z\" (UniqueName: \"kubernetes.io/projected/509cd3a8-f3bb-4214-a70b-e589905ad242-kube-api-access-gdq6z\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:20 crc kubenswrapper[4826]: I0319 18:59:20.507807 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerStarted","Data":"f2589c41928158c45683446c7aacf9c21d181745aa73af91ce6307262c514527"} Mar 19 18:59:20 crc kubenswrapper[4826]: I0319 18:59:20.509811 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565778-hxldh" Mar 19 18:59:20 crc kubenswrapper[4826]: I0319 18:59:20.510210 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565778-hxldh" event={"ID":"509cd3a8-f3bb-4214-a70b-e589905ad242","Type":"ContainerDied","Data":"f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1"} Mar 19 18:59:20 crc kubenswrapper[4826]: I0319 18:59:20.510237 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f732234a1dfd37c93e5311d567ecb0c71e547a83a35ffe4e10ae66c7f109dfb1" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.885000 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.885436 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.937476 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.965255 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rrk5r" podStartSLOduration=8.089336843 podStartE2EDuration="1m7.96523458s" podCreationTimestamp="2026-03-19 18:58:17 +0000 UTC" firstStartedPulling="2026-03-19 18:58:20.425891243 +0000 UTC m=+125.179959556" lastFinishedPulling="2026-03-19 18:59:20.30178899 +0000 UTC m=+185.055857293" observedRunningTime="2026-03-19 18:59:20.533424358 +0000 UTC m=+185.287492681" watchObservedRunningTime="2026-03-19 18:59:24.96523458 +0000 UTC m=+189.719302893" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.997110 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:59:24 crc kubenswrapper[4826]: I0319 18:59:24.997177 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.051806 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.606192 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.638274 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.638347 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.640875 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 18:59:25 crc kubenswrapper[4826]: I0319 18:59:25.704394 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:26 crc kubenswrapper[4826]: I0319 18:59:26.603211 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:26 crc kubenswrapper[4826]: I0319 18:59:26.945117 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:59:26 crc kubenswrapper[4826]: I0319 18:59:26.945177 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:59:26 crc kubenswrapper[4826]: 
I0319 18:59:26.994620 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:59:27 crc kubenswrapper[4826]: I0319 18:59:27.458780 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:59:27 crc kubenswrapper[4826]: I0319 18:59:27.495715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:59:27 crc kubenswrapper[4826]: I0319 18:59:27.610563 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 18:59:27 crc kubenswrapper[4826]: I0319 18:59:27.829196 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.035024 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.035102 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.113536 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.437711 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.437800 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.492809 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.577307 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8rs7z" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="registry-server" containerID="cri-o://454b0212ff1afe3db8244ac37e01a01b89ad0068661caa0f50cc7f6d0e578733" gracePeriod=2 Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.645173 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:28 crc kubenswrapper[4826]: I0319 18:59:28.652227 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.233637 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.233975 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vsrvh" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="registry-server" containerID="cri-o://2fb7b21ab589ac0efc884c09f33932a8e96fcc9b586b78ee8a7d4ccb502be1db" gracePeriod=2 Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.590064 4826 generic.go:334] "Generic (PLEG): container finished" podID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerID="2fb7b21ab589ac0efc884c09f33932a8e96fcc9b586b78ee8a7d4ccb502be1db" exitCode=0 Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.590114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerDied","Data":"2fb7b21ab589ac0efc884c09f33932a8e96fcc9b586b78ee8a7d4ccb502be1db"} Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.598856 4826 generic.go:334] 
"Generic (PLEG): container finished" podID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerID="454b0212ff1afe3db8244ac37e01a01b89ad0068661caa0f50cc7f6d0e578733" exitCode=0 Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.599101 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerDied","Data":"454b0212ff1afe3db8244ac37e01a01b89ad0068661caa0f50cc7f6d0e578733"} Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.845468 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.972925 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content\") pod \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.972992 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities\") pod \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.973014 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qghf\" (UniqueName: \"kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf\") pod \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\" (UID: \"dcf719a6-7a63-4efa-b8dd-1beba09934f9\") " Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.973697 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities" (OuterVolumeSpecName: 
"utilities") pod "dcf719a6-7a63-4efa-b8dd-1beba09934f9" (UID: "dcf719a6-7a63-4efa-b8dd-1beba09934f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:29 crc kubenswrapper[4826]: I0319 18:59:29.979838 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf" (OuterVolumeSpecName: "kube-api-access-4qghf") pod "dcf719a6-7a63-4efa-b8dd-1beba09934f9" (UID: "dcf719a6-7a63-4efa-b8dd-1beba09934f9"). InnerVolumeSpecName "kube-api-access-4qghf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.041232 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf719a6-7a63-4efa-b8dd-1beba09934f9" (UID: "dcf719a6-7a63-4efa-b8dd-1beba09934f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.074120 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.074161 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf719a6-7a63-4efa-b8dd-1beba09934f9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.074175 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qghf\" (UniqueName: \"kubernetes.io/projected/dcf719a6-7a63-4efa-b8dd-1beba09934f9-kube-api-access-4qghf\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.610563 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rs7z" event={"ID":"dcf719a6-7a63-4efa-b8dd-1beba09934f9","Type":"ContainerDied","Data":"8808d6cdfcc08c439d0ad29456c5553236abe3af32f30aab313ebc2ce01b6ad1"} Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.610642 4826 scope.go:117] "RemoveContainer" containerID="454b0212ff1afe3db8244ac37e01a01b89ad0068661caa0f50cc7f6d0e578733" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.610690 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rs7z" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.633733 4826 scope.go:117] "RemoveContainer" containerID="172c01382de3818f295008f8b636baa62dcbc571906ef7f9f2762319ed12e2e3" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.653749 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.667765 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8rs7z"] Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.680477 4826 scope.go:117] "RemoveContainer" containerID="fce0ed192b79f75994a875d13faaa198326ed2a587b5059461fafa6ede5ebe93" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.770783 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.888362 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content\") pod \"06fdacd5-0f40-4d55-8df2-67ea56f25595\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.888439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities\") pod \"06fdacd5-0f40-4d55-8df2-67ea56f25595\" (UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.888471 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkc6j\" (UniqueName: \"kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j\") pod \"06fdacd5-0f40-4d55-8df2-67ea56f25595\" 
(UID: \"06fdacd5-0f40-4d55-8df2-67ea56f25595\") " Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.889773 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities" (OuterVolumeSpecName: "utilities") pod "06fdacd5-0f40-4d55-8df2-67ea56f25595" (UID: "06fdacd5-0f40-4d55-8df2-67ea56f25595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.893960 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j" (OuterVolumeSpecName: "kube-api-access-xkc6j") pod "06fdacd5-0f40-4d55-8df2-67ea56f25595" (UID: "06fdacd5-0f40-4d55-8df2-67ea56f25595"). InnerVolumeSpecName "kube-api-access-xkc6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.917909 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06fdacd5-0f40-4d55-8df2-67ea56f25595" (UID: "06fdacd5-0f40-4d55-8df2-67ea56f25595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.990947 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.991344 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fdacd5-0f40-4d55-8df2-67ea56f25595-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:30 crc kubenswrapper[4826]: I0319 18:59:30.991410 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkc6j\" (UniqueName: \"kubernetes.io/projected/06fdacd5-0f40-4d55-8df2-67ea56f25595-kube-api-access-xkc6j\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.628916 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsrvh" event={"ID":"06fdacd5-0f40-4d55-8df2-67ea56f25595","Type":"ContainerDied","Data":"853904801d629cce8382d4c3d0e460ba5c864b42e9656e1d27b51ed877edf187"} Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.628999 4826 scope.go:117] "RemoveContainer" containerID="2fb7b21ab589ac0efc884c09f33932a8e96fcc9b586b78ee8a7d4ccb502be1db" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.629114 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsrvh" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.646460 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.649051 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rrk5r" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="registry-server" containerID="cri-o://f2589c41928158c45683446c7aacf9c21d181745aa73af91ce6307262c514527" gracePeriod=2 Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.655002 4826 scope.go:117] "RemoveContainer" containerID="d5016f0f73675f0f20a63c82ea7680c6b52a107e28bc5551253a53795e4f16a4" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.678816 4826 scope.go:117] "RemoveContainer" containerID="7a031b5b2dbd89420ef42b2c1ecc4234428d6efa972eb316e2ebbd27d0aa48ce" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.683411 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.685643 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsrvh"] Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.990336 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" path="/var/lib/kubelet/pods/06fdacd5-0f40-4d55-8df2-67ea56f25595/volumes" Mar 19 18:59:31 crc kubenswrapper[4826]: I0319 18:59:31.991908 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" path="/var/lib/kubelet/pods/dcf719a6-7a63-4efa-b8dd-1beba09934f9/volumes" Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.642515 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4293235-5c04-462c-bef4-8595d0c89ec6" 
containerID="f2589c41928158c45683446c7aacf9c21d181745aa73af91ce6307262c514527" exitCode=0 Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.642593 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerDied","Data":"f2589c41928158c45683446c7aacf9c21d181745aa73af91ce6307262c514527"} Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.723830 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.819831 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content\") pod \"f4293235-5c04-462c-bef4-8595d0c89ec6\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.820444 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs6z8\" (UniqueName: \"kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8\") pod \"f4293235-5c04-462c-bef4-8595d0c89ec6\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.820493 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities\") pod \"f4293235-5c04-462c-bef4-8595d0c89ec6\" (UID: \"f4293235-5c04-462c-bef4-8595d0c89ec6\") " Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.821430 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities" (OuterVolumeSpecName: "utilities") pod "f4293235-5c04-462c-bef4-8595d0c89ec6" (UID: 
"f4293235-5c04-462c-bef4-8595d0c89ec6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.835605 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8" (OuterVolumeSpecName: "kube-api-access-vs6z8") pod "f4293235-5c04-462c-bef4-8595d0c89ec6" (UID: "f4293235-5c04-462c-bef4-8595d0c89ec6"). InnerVolumeSpecName "kube-api-access-vs6z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.922412 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs6z8\" (UniqueName: \"kubernetes.io/projected/f4293235-5c04-462c-bef4-8595d0c89ec6-kube-api-access-vs6z8\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:32 crc kubenswrapper[4826]: I0319 18:59:32.922472 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.076615 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4293235-5c04-462c-bef4-8595d0c89ec6" (UID: "f4293235-5c04-462c-bef4-8595d0c89ec6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.125646 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4293235-5c04-462c-bef4-8595d0c89ec6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.654827 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrk5r" event={"ID":"f4293235-5c04-462c-bef4-8595d0c89ec6","Type":"ContainerDied","Data":"cbb652796d0158a9aaa43d53c18282ff40d16f75b4f0526c646a1a3c05141110"} Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.654869 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrk5r" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.654914 4826 scope.go:117] "RemoveContainer" containerID="f2589c41928158c45683446c7aacf9c21d181745aa73af91ce6307262c514527" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.677675 4826 scope.go:117] "RemoveContainer" containerID="111c51982c185b20f4c4658558b709bc304251905bf99c326ce02c437ef79df4" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.694874 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.700229 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rrk5r"] Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.703774 4826 scope.go:117] "RemoveContainer" containerID="82335a34432660f4c22f60a6784a80e48e14edfc797730f574e363efdb7ab22a" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.881592 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"] Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.881984 4826 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" podUID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" containerName="controller-manager" containerID="cri-o://b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc" gracePeriod=30 Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.985479 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" path="/var/lib/kubelet/pods/f4293235-5c04-462c-bef4-8595d0c89ec6/volumes" Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.986210 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"] Mar 19 18:59:33 crc kubenswrapper[4826]: I0319 18:59:33.986444 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" podUID="a7fcb636-1429-47d8-8bef-ad1131064a3d" containerName="route-controller-manager" containerID="cri-o://9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9" gracePeriod=30 Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.444346 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.447872 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547164 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config\") pod \"a7fcb636-1429-47d8-8bef-ad1131064a3d\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547217 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert\") pod \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547246 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert\") pod \"a7fcb636-1429-47d8-8bef-ad1131064a3d\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547269 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca\") pod \"a7fcb636-1429-47d8-8bef-ad1131064a3d\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547289 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca\") pod \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547321 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdjhq\" (UniqueName: 
\"kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq\") pod \"a7fcb636-1429-47d8-8bef-ad1131064a3d\" (UID: \"a7fcb636-1429-47d8-8bef-ad1131064a3d\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547341 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles\") pod \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547361 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4m2\" (UniqueName: \"kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2\") pod \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.547383 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config\") pod \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\" (UID: \"04db62ac-86c8-43d1-8d31-c3882c07d4fc\") " Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.548375 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config" (OuterVolumeSpecName: "config") pod "04db62ac-86c8-43d1-8d31-c3882c07d4fc" (UID: "04db62ac-86c8-43d1-8d31-c3882c07d4fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.548780 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config" (OuterVolumeSpecName: "config") pod "a7fcb636-1429-47d8-8bef-ad1131064a3d" (UID: "a7fcb636-1429-47d8-8bef-ad1131064a3d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.549423 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "04db62ac-86c8-43d1-8d31-c3882c07d4fc" (UID: "04db62ac-86c8-43d1-8d31-c3882c07d4fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.550255 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "04db62ac-86c8-43d1-8d31-c3882c07d4fc" (UID: "04db62ac-86c8-43d1-8d31-c3882c07d4fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.550249 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7fcb636-1429-47d8-8bef-ad1131064a3d" (UID: "a7fcb636-1429-47d8-8bef-ad1131064a3d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.552526 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04db62ac-86c8-43d1-8d31-c3882c07d4fc" (UID: "04db62ac-86c8-43d1-8d31-c3882c07d4fc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.553563 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7fcb636-1429-47d8-8bef-ad1131064a3d" (UID: "a7fcb636-1429-47d8-8bef-ad1131064a3d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.553694 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq" (OuterVolumeSpecName: "kube-api-access-cdjhq") pod "a7fcb636-1429-47d8-8bef-ad1131064a3d" (UID: "a7fcb636-1429-47d8-8bef-ad1131064a3d"). InnerVolumeSpecName "kube-api-access-cdjhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.554643 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2" (OuterVolumeSpecName: "kube-api-access-xf4m2") pod "04db62ac-86c8-43d1-8d31-c3882c07d4fc" (UID: "04db62ac-86c8-43d1-8d31-c3882c07d4fc"). InnerVolumeSpecName "kube-api-access-xf4m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648628 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7fcb636-1429-47d8-8bef-ad1131064a3d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648683 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648693 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648703 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdjhq\" (UniqueName: \"kubernetes.io/projected/a7fcb636-1429-47d8-8bef-ad1131064a3d-kube-api-access-cdjhq\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648715 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648724 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4m2\" (UniqueName: \"kubernetes.io/projected/04db62ac-86c8-43d1-8d31-c3882c07d4fc-kube-api-access-xf4m2\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648732 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04db62ac-86c8-43d1-8d31-c3882c07d4fc-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648740 4826 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7fcb636-1429-47d8-8bef-ad1131064a3d-config\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.648748 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db62ac-86c8-43d1-8d31-c3882c07d4fc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.660261 4826 generic.go:334] "Generic (PLEG): container finished" podID="a7fcb636-1429-47d8-8bef-ad1131064a3d" containerID="9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9" exitCode=0 Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.660389 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" event={"ID":"a7fcb636-1429-47d8-8bef-ad1131064a3d","Type":"ContainerDied","Data":"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9"} Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.660407 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.660441 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s" event={"ID":"a7fcb636-1429-47d8-8bef-ad1131064a3d","Type":"ContainerDied","Data":"aad4ebea09c62420944e57a3f7ef31cb4aabb53ae8e3cabec075854a0893d15b"} Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.660463 4826 scope.go:117] "RemoveContainer" containerID="9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.662867 4826 generic.go:334] "Generic (PLEG): container finished" podID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" containerID="b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc" exitCode=0 Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.662996 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.663165 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" event={"ID":"04db62ac-86c8-43d1-8d31-c3882c07d4fc","Type":"ContainerDied","Data":"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc"} Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.663259 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6d575554-k7thv" event={"ID":"04db62ac-86c8-43d1-8d31-c3882c07d4fc","Type":"ContainerDied","Data":"15aad52bc90c27e4c1efac88d0149791c9c71f3a40e4abd094ebc55ad95c88ce"} Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.677688 4826 scope.go:117] "RemoveContainer" containerID="9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9" Mar 19 18:59:34 crc kubenswrapper[4826]: E0319 18:59:34.678213 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9\": container with ID starting with 9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9 not found: ID does not exist" containerID="9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.678258 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9"} err="failed to get container status \"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9\": rpc error: code = NotFound desc = could not find container \"9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9\": container with ID starting with 9eae0c06074d29e17e06a536ad4714ae4ccedb34372308a7790b4adc15acdaa9 not found: ID does not 
exist" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.678284 4826 scope.go:117] "RemoveContainer" containerID="b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.701821 4826 scope.go:117] "RemoveContainer" containerID="b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.701957 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"] Mar 19 18:59:34 crc kubenswrapper[4826]: E0319 18:59:34.702352 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc\": container with ID starting with b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc not found: ID does not exist" containerID="b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.702415 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc"} err="failed to get container status \"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc\": rpc error: code = NotFound desc = could not find container \"b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc\": container with ID starting with b269e2c9889eff7b20388ca34f2d4c64152a0e26c2fc98ccf7766a308b06e9cc not found: ID does not exist" Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.706182 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67d7bffc66-k8x2s"] Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.718033 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"] Mar 19 18:59:34 crc kubenswrapper[4826]: I0319 18:59:34.721975 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c6d575554-k7thv"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.061940 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062287 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062302 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062324 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062332 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062345 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062352 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062361 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062368 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062380 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062387 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062395 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" containerName="oc" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062404 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" containerName="oc" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062414 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062420 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062428 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" containerName="controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062435 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" containerName="controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062444 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062450 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062458 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062468 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062478 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062485 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="extract-content" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062492 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fcb636-1429-47d8-8bef-ad1131064a3d" containerName="route-controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062498 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fcb636-1429-47d8-8bef-ad1131064a3d" containerName="route-controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062507 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062516 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="extract-utilities" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062525 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062532 4826 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: E0319 18:59:35.062542 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062549 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062687 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" containerName="oc" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062697 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fcb636-1429-47d8-8bef-ad1131064a3d" containerName="route-controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062706 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf719a6-7a63-4efa-b8dd-1beba09934f9" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062712 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4293235-5c04-462c-bef4-8595d0c89ec6" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062721 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" containerName="controller-manager" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062733 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="06fdacd5-0f40-4d55-8df2-67ea56f25595" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.062744 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="007d8118-0079-4d3d-b764-01eadbd419c5" containerName="registry-server" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.063284 4826 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.064439 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.065413 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.067923 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.068458 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.068646 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069095 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069147 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069302 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069315 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069382 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069524 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.069699 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.070606 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.072135 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.080326 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.085218 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.089175 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152837 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152898 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152921 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152942 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152973 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.152990 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slnn9\" (UniqueName: \"kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.153014 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.153294 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.153376 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2sk\" (UniqueName: \"kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254692 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254745 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-slnn9\" (UniqueName: \"kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254774 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254811 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254850 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2sk\" (UniqueName: \"kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254901 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 
19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254934 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254957 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.254982 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.256213 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.257029 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: 
\"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.257259 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.257491 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.260082 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.261216 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.261265 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert\") pod 
\"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.287005 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2sk\" (UniqueName: \"kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk\") pod \"controller-manager-64c9cf5fcb-kq7k9\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.290792 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slnn9\" (UniqueName: \"kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9\") pod \"route-controller-manager-5ffd9cb86-bg57h\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.393949 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.406595 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.654937 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.675047 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" event={"ID":"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8","Type":"ContainerStarted","Data":"ef5861fb683048f27977415b22572739fd26519dc69a703979a87cf419c3fe5a"} Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.942917 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.987574 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04db62ac-86c8-43d1-8d31-c3882c07d4fc" path="/var/lib/kubelet/pods/04db62ac-86c8-43d1-8d31-c3882c07d4fc/volumes" Mar 19 18:59:35 crc kubenswrapper[4826]: I0319 18:59:35.989093 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fcb636-1429-47d8-8bef-ad1131064a3d" path="/var/lib/kubelet/pods/a7fcb636-1429-47d8-8bef-ad1131064a3d/volumes" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.686857 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" event={"ID":"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8","Type":"ContainerStarted","Data":"fd88dbfbae7a3b7e1aa5860fbfc5ca58b40879e9050d0f513a269557ae1f7813"} Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.687715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.688527 4826 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" event={"ID":"59640edf-3068-4993-8b50-c049106935ee","Type":"ContainerStarted","Data":"cb3568bc85dcd89368e79d7fcdd3fef2bd4c8973e88249e6a850d7bf0efa0dbe"} Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.688553 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" event={"ID":"59640edf-3068-4993-8b50-c049106935ee","Type":"ContainerStarted","Data":"fa550695db4b8560ef6ed765ed33de6a28714c3730a634164f6bc332002252dc"} Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.688788 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.692920 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.695411 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.713930 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" podStartSLOduration=3.713911614 podStartE2EDuration="3.713911614s" podCreationTimestamp="2026-03-19 18:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:36.710553239 +0000 UTC m=+201.464621552" watchObservedRunningTime="2026-03-19 18:59:36.713911614 +0000 UTC m=+201.467979927" Mar 19 18:59:36 crc kubenswrapper[4826]: I0319 18:59:36.750848 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" podStartSLOduration=2.750830437 podStartE2EDuration="2.750830437s" podCreationTimestamp="2026-03-19 18:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 18:59:36.749308234 +0000 UTC m=+201.503376627" watchObservedRunningTime="2026-03-19 18:59:36.750830437 +0000 UTC m=+201.504898740" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.674535 4826 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.675732 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.675850 4826 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676233 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae" gracePeriod=15 Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676312 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a" gracePeriod=15 Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676350 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21" gracePeriod=15 Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676401 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab" gracePeriod=15 Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676401 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8" gracePeriod=15 Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.676957 4826 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677116 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677137 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677152 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677160 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677172 4826 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677181 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677193 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677203 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677214 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677221 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677233 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677240 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677252 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677260 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677269 
4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677277 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677390 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677404 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677414 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677423 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677432 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677441 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677450 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677461 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677585 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677596 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: E0319 18:59:40.677609 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677617 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.677775 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.743916 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744043 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc 
kubenswrapper[4826]: I0319 18:59:40.744099 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744148 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744241 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744273 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744329 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.744359 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.849713 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850194 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850235 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850320 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 
18:59:40.850367 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850403 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850434 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850466 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850584 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.849850 
4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850676 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850737 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850769 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850797 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:40 crc kubenswrapper[4826]: I0319 18:59:40.850831 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.739474 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.741943 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.742864 4826 scope.go:117] "RemoveContainer" containerID="d6543dc21146ffce18eefd1d6f58480662c580fc8dbb20550656709811dd6cc7" Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.742778 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8" exitCode=0 Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.743362 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a" exitCode=0 Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.743375 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21" exitCode=0 Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.743388 4826 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab" exitCode=2 Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.745349 4826 generic.go:334] "Generic (PLEG): container finished" podID="76ee4930-51ba-4831-9d42-469ffb000e9d" containerID="5ef4d51ec32ff8ae1d9aaec3ae8413fab8773bb251a2b2746f2b25fafbf0dbc9" exitCode=0 Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.745472 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76ee4930-51ba-4831-9d42-469ffb000e9d","Type":"ContainerDied","Data":"5ef4d51ec32ff8ae1d9aaec3ae8413fab8773bb251a2b2746f2b25fafbf0dbc9"} Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.746247 4826 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:41 crc kubenswrapper[4826]: I0319 18:59:41.746527 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.352555 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:42Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:42Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:42Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:42Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2fd3c01420dada0aac3ddcf5f3e15bb4f77216eb1b18b7543cb5c955674faac6\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b2e45bfeec42763e914803d0552e0f83028f1caf63487926ea163bb344ae59a4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746814424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:9d7d63c5c1a1c201b9d6a680e47dcad7003d26867b0a82fa07411ad0014a2bf5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2426feb9ad97ce3bf42a2d9546c6959370de6acf6441ff9d22973e0a00e7a64\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252412131},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3a0729ac2d723d4dfbe8dab8121792a6f3caebacff42048e4ed85dd2ae1ca741\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d1d373f7f344e2d85cc27acf105a4ab3f429077302678e1323eef34664302ac7\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223675094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.353945 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.354617 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.355372 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.356093 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.356297 4826 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:59:42 
crc kubenswrapper[4826]: E0319 18:59:42.705974 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.707242 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.708150 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.708596 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.708988 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:42 crc kubenswrapper[4826]: I0319 18:59:42.709164 4826 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.709772 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Mar 19 18:59:42 
crc kubenswrapper[4826]: I0319 18:59:42.765248 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:42 crc kubenswrapper[4826]: E0319 18:59:42.910724 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.065464 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.066550 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.067332 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.068288 4826 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.184676 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") 
pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.185403 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.185439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.184744 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.185617 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.185749 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.186553 4826 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.186581 4826 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.186594 4826 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.197180 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.197865 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.198596 4826 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.287688 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access\") pod \"76ee4930-51ba-4831-9d42-469ffb000e9d\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.287878 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock\") pod \"76ee4930-51ba-4831-9d42-469ffb000e9d\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.287975 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir\") pod \"76ee4930-51ba-4831-9d42-469ffb000e9d\" (UID: \"76ee4930-51ba-4831-9d42-469ffb000e9d\") " Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.288324 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "76ee4930-51ba-4831-9d42-469ffb000e9d" (UID: "76ee4930-51ba-4831-9d42-469ffb000e9d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.288408 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock" (OuterVolumeSpecName: "var-lock") pod "76ee4930-51ba-4831-9d42-469ffb000e9d" (UID: "76ee4930-51ba-4831-9d42-469ffb000e9d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.298647 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76ee4930-51ba-4831-9d42-469ffb000e9d" (UID: "76ee4930-51ba-4831-9d42-469ffb000e9d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.312537 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.390072 4826 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.390166 4826 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76ee4930-51ba-4831-9d42-469ffb000e9d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.390183 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76ee4930-51ba-4831-9d42-469ffb000e9d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.796095 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.801215 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae" exitCode=0 Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.801270 4826 scope.go:117] "RemoveContainer" containerID="924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.801384 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.805349 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"76ee4930-51ba-4831-9d42-469ffb000e9d","Type":"ContainerDied","Data":"05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4"} Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.805392 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.805411 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f1a59a966c4555c4656f1e8f92f03862063aeb63e6bc4c871f0d2ae77972e4" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.821578 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.822261 4826 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" 
Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.823184 4826 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.824127 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.828253 4826 scope.go:117] "RemoveContainer" containerID="4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.849115 4826 scope.go:117] "RemoveContainer" containerID="0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.870058 4826 scope.go:117] "RemoveContainer" containerID="bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.889114 4826 scope.go:117] "RemoveContainer" containerID="1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.906243 4826 scope.go:117] "RemoveContainer" containerID="114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.929630 4826 scope.go:117] "RemoveContainer" containerID="924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.930241 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8\": container with ID starting with 924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8 not found: ID does not exist" containerID="924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.930290 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8"} err="failed to get container status \"924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8\": rpc error: code = NotFound desc = could not find container \"924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8\": container with ID starting with 924eaf45c108010a89e613176dcd7b45a23f2ab5f857de0f6f9d92f5f4ddc7f8 not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.930321 4826 scope.go:117] "RemoveContainer" containerID="4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.930612 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a\": container with ID starting with 4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a not found: ID does not exist" containerID="4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.930731 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a"} err="failed to get container status \"4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a\": rpc error: code = NotFound desc = could not find container 
\"4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a\": container with ID starting with 4b5b9841cd846e58c72f4acc03b1509604b816bef5c45da0fc98f7483671822a not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.930744 4826 scope.go:117] "RemoveContainer" containerID="0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.931177 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21\": container with ID starting with 0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21 not found: ID does not exist" containerID="0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931195 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21"} err="failed to get container status \"0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21\": rpc error: code = NotFound desc = could not find container \"0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21\": container with ID starting with 0ea714363aa8ce7507efbab8cbb23b850bb2fa272d7cf20eb2c9eb8af0a3da21 not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931208 4826 scope.go:117] "RemoveContainer" containerID="bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.931502 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab\": container with ID starting with bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab not found: ID does not exist" 
containerID="bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931523 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab"} err="failed to get container status \"bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab\": rpc error: code = NotFound desc = could not find container \"bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab\": container with ID starting with bbf79230bb3f40d8a5de7b681913877be3e763cae02c99c6ebe12bff0e0319ab not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931542 4826 scope.go:117] "RemoveContainer" containerID="1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.931870 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae\": container with ID starting with 1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae not found: ID does not exist" containerID="1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931920 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae"} err="failed to get container status \"1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae\": rpc error: code = NotFound desc = could not find container \"1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae\": container with ID starting with 1c7e8bd5e1686bfadfc6f61c8436c0ca538cebbebb8fdafa685621c729b143ae not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.931957 4826 scope.go:117] 
"RemoveContainer" containerID="114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8" Mar 19 18:59:43 crc kubenswrapper[4826]: E0319 18:59:43.933562 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\": container with ID starting with 114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8 not found: ID does not exist" containerID="114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.933632 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8"} err="failed to get container status \"114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\": rpc error: code = NotFound desc = could not find container \"114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8\": container with ID starting with 114ac8bf22a2fbdae76be3b65c0c6a0b81a43812c8fb3559af532d5f14eb50d8 not found: ID does not exist" Mar 19 18:59:43 crc kubenswrapper[4826]: I0319 18:59:43.984595 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 18:59:44 crc kubenswrapper[4826]: E0319 18:59:44.114366 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Mar 19 18:59:45 crc kubenswrapper[4826]: E0319 18:59:45.716277 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.69:6443: connect: connection refused" interval="3.2s" Mar 19 18:59:45 crc kubenswrapper[4826]: E0319 18:59:45.738987 4826 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:45 crc kubenswrapper[4826]: I0319 18:59:45.739717 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:45 crc kubenswrapper[4826]: E0319 18:59:45.779782 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e5333f0ce2239 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:59:45.779032633 +0000 UTC m=+210.533100946,LastTimestamp:2026-03-19 18:59:45.779032633 +0000 UTC m=+210.533100946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:59:45 crc kubenswrapper[4826]: I0319 18:59:45.843225 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bb6e0f5e769be03f6d3723c5efc49b458a2d0728902e2527da0898ab41c4e107"} Mar 19 18:59:45 crc kubenswrapper[4826]: I0319 18:59:45.979436 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:46 crc kubenswrapper[4826]: I0319 18:59:46.852121 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc"} Mar 19 18:59:46 crc kubenswrapper[4826]: I0319 18:59:46.853202 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:46 crc kubenswrapper[4826]: E0319 18:59:46.853249 4826 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 18:59:47 crc kubenswrapper[4826]: E0319 18:59:47.859525 4826 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 
18:59:48 crc kubenswrapper[4826]: E0319 18:59:48.917919 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="6.4s" Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.549378 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T18:59:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2fd3c01420dada0aac3ddcf5f3e15bb4f77216eb1b18b7543cb5c955674faac6\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b2e45bfeec42763e914803d0552e0f83028f1caf63487926ea163bb344ae59a4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746814424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat
.io/redhat/certified-operator-index@sha256:9d7d63c5c1a1c201b9d6a680e47dcad7003d26867b0a82fa07411ad0014a2bf5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2426feb9ad97ce3bf42a2d9546c6959370de6acf6441ff9d22973e0a00e7a64\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252412131},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3a0729ac2d723d4dfbe8dab8121792a6f3caebacff42048e4ed85dd2ae1ca741\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d1d373f7f344e2d85cc27acf105a4ab3f429077302678e1323eef34664302ac7\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223675094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\
\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1
bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.550393 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.550778 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.551076 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" 
Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.551225 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:52 crc kubenswrapper[4826]: E0319 18:59:52.551242 4826 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 18:59:53 crc kubenswrapper[4826]: E0319 18:59:53.360286 4826 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e5333f0ce2239 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 18:59:45.779032633 +0000 UTC m=+210.533100946,LastTimestamp:2026-03-19 18:59:45.779032633 +0000 UTC m=+210.533100946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 18:59:53 crc kubenswrapper[4826]: I0319 18:59:53.980599 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:53 crc kubenswrapper[4826]: I0319 18:59:53.981900 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.004832 4826 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.004892 4826 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:54 crc kubenswrapper[4826]: E0319 18:59:54.005614 4826 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.006455 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:54 crc kubenswrapper[4826]: W0319 18:59:54.035914 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-75b33badf318d44150c0fe04897c6306d88f38176d832588ce10694b9d031960 WatchSource:0}: Error finding container 75b33badf318d44150c0fe04897c6306d88f38176d832588ce10694b9d031960: Status 404 returned error can't find the container with id 75b33badf318d44150c0fe04897c6306d88f38176d832588ce10694b9d031960 Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.915974 4826 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a4403811044bb11b7446b1ca785e197d5484f78550169d5188bf78647bbb2403" exitCode=0 Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.916104 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a4403811044bb11b7446b1ca785e197d5484f78550169d5188bf78647bbb2403"} Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.916482 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"75b33badf318d44150c0fe04897c6306d88f38176d832588ce10694b9d031960"} Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.916943 4826 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.916969 4826 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:54 crc kubenswrapper[4826]: E0319 18:59:54.918090 4826 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:54 crc kubenswrapper[4826]: I0319 18:59:54.918586 4826 status_manager.go:851] "Failed to get status for pod" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Mar 19 18:59:55 crc kubenswrapper[4826]: E0319 18:59:55.318952 4826 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="7s" Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.401278 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.401392 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.931270 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:59:55 crc 
kubenswrapper[4826]: I0319 18:59:55.932544 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.932622 4826 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a22f8de90a48c727556cc628544f3262bb1f7f32592a6672b8895a9e395d28af" exitCode=1 Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.932740 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a22f8de90a48c727556cc628544f3262bb1f7f32592a6672b8895a9e395d28af"} Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.934561 4826 scope.go:117] "RemoveContainer" containerID="a22f8de90a48c727556cc628544f3262bb1f7f32592a6672b8895a9e395d28af" Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.946496 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e3442cd3adf4b4541ce6a18934aa11368c43265de685f849f3a4c67b59c59ca3"} Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.946576 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bfc8817922f399af55f046416e2b0ba3c779b6b9dabf60e79b588f702920af67"} Mar 19 18:59:55 crc kubenswrapper[4826]: I0319 18:59:55.946599 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c2e039351ac0159d6dbb0b56b543c9b1108b010d172ee33b2629d00bfffd5953"} Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 
18:59:56.904020 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.956462 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.956928 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.957009 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f29e73691785ec65fa525c1e8b3f6ee226df96250aa3138bf89302e5e9c8b33e"} Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.961522 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"21b25e4fde389400404fbbafe5cfbb2fb7682a9ae76620bc654a261b10b44a84"} Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.961584 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5bc2026820e2d8e287ddd72ccdd071fdc9487c515ac103018ae7c751ad9c6df0"} Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.961937 4826 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.961956 4826 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 18:59:56 crc kubenswrapper[4826]: I0319 18:59:56.966825 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:59 crc kubenswrapper[4826]: I0319 18:59:59.007148 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:59 crc kubenswrapper[4826]: I0319 18:59:59.007812 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:59 crc kubenswrapper[4826]: I0319 18:59:59.012955 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 18:59:59 crc kubenswrapper[4826]: I0319 18:59:59.257757 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:01 crc kubenswrapper[4826]: I0319 19:00:01.974227 4826 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:01 crc kubenswrapper[4826]: I0319 19:00:01.993965 4826 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 19:00:01 crc kubenswrapper[4826]: I0319 19:00:01.994007 4826 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 19:00:01 crc kubenswrapper[4826]: I0319 19:00:01.999068 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 19:00:02 crc kubenswrapper[4826]: I0319 19:00:02.010048 4826 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9300619c-225e-443f-94bc-c464d1c7ceda" Mar 19 19:00:02 crc kubenswrapper[4826]: I0319 19:00:02.999563 4826 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 19:00:02 crc kubenswrapper[4826]: I0319 19:00:02.999603 4826 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e43f424d-e1b9-437a-9f69-704a183575d4" Mar 19 19:00:03 crc kubenswrapper[4826]: I0319 19:00:03.002167 4826 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9300619c-225e-443f-94bc-c464d1c7ceda" Mar 19 19:00:05 crc kubenswrapper[4826]: I0319 19:00:05.537224 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:05 crc kubenswrapper[4826]: I0319 19:00:05.544348 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:06 crc kubenswrapper[4826]: I0319 19:00:06.038155 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 19:00:11 crc kubenswrapper[4826]: I0319 19:00:11.898837 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.171132 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.238525 4826 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.295437 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.346801 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.346900 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 19:00:12 crc kubenswrapper[4826]: I0319 19:00:12.950276 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.154803 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.158969 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.176393 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.266230 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.296847 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.443028 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 
19:00:13.575541 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.579517 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 19:00:13 crc kubenswrapper[4826]: I0319 19:00:13.749011 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.013129 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.016870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.143300 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.263646 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.326572 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.504032 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.526624 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.550347 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 19:00:14 crc 
kubenswrapper[4826]: I0319 19:00:14.565354 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 19:00:14 crc kubenswrapper[4826]: I0319 19:00:14.804464 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.041469 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.143774 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.204976 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.228475 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.245627 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.423644 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.547008 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.655799 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.715579 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.817257 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.846997 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 19:00:15 crc kubenswrapper[4826]: I0319 19:00:15.850769 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.034075 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.106246 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.158449 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.175380 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.221434 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.224917 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.244351 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.268027 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.280212 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.285117 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.320305 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.360443 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.385393 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.402242 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.508413 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.605676 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.609019 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.664568 4826 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.676730 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.688328 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.711533 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.777546 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.786094 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.889278 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.896289 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.948631 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 19:00:16 crc kubenswrapper[4826]: I0319 19:00:16.990605 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.048157 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.146992 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.201281 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.230011 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.285538 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.306611 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.330306 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.339207 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.451546 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.464327 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.489195 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.529798 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.662638 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.781451 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.829005 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.885717 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 19:00:17 crc kubenswrapper[4826]: I0319 19:00:17.899410 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.059628 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.141070 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.211039 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.240682 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.260611 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.300502 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.307829 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.355438 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.356802 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.367455 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.469091 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.589039 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.633282 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.645590 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.833392 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.837762 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.860035 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.887895 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.927541 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 19:00:18 crc kubenswrapper[4826]: I0319 19:00:18.968926 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.113467 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.118312 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.165226 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.170750 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.220702 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.344435 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.374346 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.422295 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.497131 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.535166 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.586490 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.643172 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.656759 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.712938 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.766294 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.957128 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.975993 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 19:00:19 crc kubenswrapper[4826]: I0319 19:00:19.983637 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.043462 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.130940 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.237403 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.239170 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.286841 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.295140 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.299287 4826 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.307960 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.308057 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff","openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29565780-w8btv"]
Mar 19 19:00:20 crc kubenswrapper[4826]: E0319 19:00:20.308425 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" containerName="installer"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.308464 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" containerName="installer"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.308720 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ee4930-51ba-4831-9d42-469ffb000e9d" containerName="installer"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.309700 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.310844 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-w8btv"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.312209 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.312591 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.313844 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.314265 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.315792 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.316191 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.326819 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.327061 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.347184 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.347154134 podStartE2EDuration="19.347154134s" podCreationTimestamp="2026-03-19 19:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:20.341905816 +0000 UTC m=+245.095974179" watchObservedRunningTime="2026-03-19 19:00:20.347154134 +0000 UTC m=+245.101222457"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.415581 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.438314 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.438504 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.438647 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xdlc\" (UniqueName: \"kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc\") pod \"auto-csr-approver-29565780-w8btv\" (UID: \"c6b4c82a-0ec5-412b-96c9-f61bc323c223\") " pod="openshift-infra/auto-csr-approver-29565780-w8btv"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.438690 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22c7v\" (UniqueName: \"kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.475343 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.539504 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.539568 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xdlc\" (UniqueName: \"kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc\") pod \"auto-csr-approver-29565780-w8btv\" (UID: \"c6b4c82a-0ec5-412b-96c9-f61bc323c223\") " pod="openshift-infra/auto-csr-approver-29565780-w8btv"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.539592 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22c7v\" (UniqueName: \"kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.539618 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.540541 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.552188 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.554534 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.557554 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22c7v\" (UniqueName: \"kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v\") pod \"collect-profiles-29565780-79mff\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.561181 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.562061 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xdlc\" (UniqueName: \"kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc\") pod \"auto-csr-approver-29565780-w8btv\" (UID: \"c6b4c82a-0ec5-412b-96c9-f61bc323c223\") " pod="openshift-infra/auto-csr-approver-29565780-w8btv"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.577327 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.610098 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.626627 4826 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.629421 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.636326 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.647253 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-w8btv"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.687891 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.774765 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.775757 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.823243 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.842875 4826 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.861202 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.933467 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 19:00:20 crc kubenswrapper[4826]: I0319 19:00:20.999950 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.015966 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.065462 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.201469 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.230894 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.235421 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.268466 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.278008 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.348234 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.362792 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.415413 4826 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.509042 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.526082 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.564345 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.580740 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.667025 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.715781 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.858515 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.867454 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.881808 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.924562 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 19:00:21 crc kubenswrapper[4826]: I0319 19:00:21.953227 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.100476 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.142164 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.236145 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.299564 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.410469 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.472099 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.565966 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.583377 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.638787 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.737252 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.839345 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.847909 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.915534 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 19:00:22 crc kubenswrapper[4826]: I0319 19:00:22.942340 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.160576 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.172305 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.277358 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.316423 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.476048 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.508132 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.513971 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.573229 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.633766 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.732822 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.751143 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.767545 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.851086 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 19 19:00:23 crc kubenswrapper[4826]: I0319 19:00:23.992108 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.025108 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.032571 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.040428 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.098799 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.129232 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 19:00:24
crc kubenswrapper[4826]: I0319 19:00:24.163782 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.289518 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.426170 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.448287 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.457828 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.552547 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.557746 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.615200 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.626135 4826 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.626425 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc" gracePeriod=5 Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.734641 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.791167 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.810426 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.904106 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.915141 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.924302 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.954478 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.974065 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 19:00:24 crc kubenswrapper[4826]: I0319 19:00:24.986464 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.001026 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 19:00:25 
crc kubenswrapper[4826]: I0319 19:00:25.068795 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.090977 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.224455 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.240167 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.373050 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.400887 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.400958 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.534890 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.656528 4826 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.725150 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.765644 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.832159 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.860471 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 19:00:25 crc kubenswrapper[4826]: I0319 19:00:25.912170 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.052149 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.253073 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.258148 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.350685 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.388612 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.436745 4826 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.440494 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.680995 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.707318 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 19:00:26 crc kubenswrapper[4826]: I0319 19:00:26.812321 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 19:00:27 crc kubenswrapper[4826]: I0319 19:00:27.002977 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 19:00:27 crc kubenswrapper[4826]: I0319 19:00:27.219544 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:00:27 crc kubenswrapper[4826]: I0319 19:00:27.346347 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 19:00:27 crc kubenswrapper[4826]: I0319 19:00:27.362237 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 19:00:27 crc kubenswrapper[4826]: I0319 19:00:27.930571 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[4826]: I0319 19:00:28.040013 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 19:00:28 
crc kubenswrapper[4826]: I0319 19:00:28.269476 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 19:00:28 crc kubenswrapper[4826]: I0319 19:00:28.366160 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.041228 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.084237 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.321687 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-w8btv"] Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.342511 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"] Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.609264 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"] Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.682169 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 19:00:29 crc kubenswrapper[4826]: I0319 19:00:29.752243 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-w8btv"] Mar 19 19:00:29 crc kubenswrapper[4826]: W0319 19:00:29.771598 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6b4c82a_0ec5_412b_96c9_f61bc323c223.slice/crio-1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f WatchSource:0}: Error finding 
container 1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f: Status 404 returned error can't find the container with id 1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.198113 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.198256 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.201133 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.214151 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.214204 4826 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc" exitCode=137 Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.214283 4826 scope.go:117] "RemoveContainer" containerID="f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.214316 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.217961 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-w8btv" event={"ID":"c6b4c82a-0ec5-412b-96c9-f61bc323c223","Type":"ContainerStarted","Data":"1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f"} Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.220831 4826 generic.go:334] "Generic (PLEG): container finished" podID="797e9c0f-d7d9-461c-ab73-7cda8c133c4d" containerID="1cf3c843ba2a4331da681572d286424c30a9112a0cb639da13e8ea62b88d6cb3" exitCode=0 Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.220878 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff" event={"ID":"797e9c0f-d7d9-461c-ab73-7cda8c133c4d","Type":"ContainerDied","Data":"1cf3c843ba2a4331da681572d286424c30a9112a0cb639da13e8ea62b88d6cb3"} Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.220906 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff" event={"ID":"797e9c0f-d7d9-461c-ab73-7cda8c133c4d","Type":"ContainerStarted","Data":"9735ad9c0a456ecea4e26e5847e5a3bc17e3202a0697e070257e12dd528e6c45"} Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.247998 4826 scope.go:117] "RemoveContainer" containerID="f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc" Mar 19 19:00:30 crc kubenswrapper[4826]: E0319 19:00:30.249239 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc\": container with ID starting with f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc not found: ID does not exist" 
containerID="f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.249294 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc"} err="failed to get container status \"f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc\": rpc error: code = NotFound desc = could not find container \"f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc\": container with ID starting with f37976ff6b1cba0fe4073772f8ba9852d055d5ddcad89ae680c2eb10d31e0ddc not found: ID does not exist" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374050 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374172 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374214 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374254 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374291 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374358 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374374 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374411 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.374503 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.375368 4826 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.375551 4826 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.375564 4826 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.375572 4826 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.381994 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:00:30 crc kubenswrapper[4826]: I0319 19:00:30.476616 4826 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.233337 4826 generic.go:334] "Generic (PLEG): container finished" podID="c6b4c82a-0ec5-412b-96c9-f61bc323c223" containerID="bd3d498d949ef15f2b6717c00121ba069a6744779c7bfaef6091b1a983c11748" exitCode=0 Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.233408 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-w8btv" event={"ID":"c6b4c82a-0ec5-412b-96c9-f61bc323c223","Type":"ContainerDied","Data":"bd3d498d949ef15f2b6717c00121ba069a6744779c7bfaef6091b1a983c11748"} Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.646580 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.695404 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume\") pod \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.695506 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22c7v\" (UniqueName: \"kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v\") pod \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.695550 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume\") pod \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\" (UID: \"797e9c0f-d7d9-461c-ab73-7cda8c133c4d\") " Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.697158 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "797e9c0f-d7d9-461c-ab73-7cda8c133c4d" (UID: "797e9c0f-d7d9-461c-ab73-7cda8c133c4d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.703497 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "797e9c0f-d7d9-461c-ab73-7cda8c133c4d" (UID: "797e9c0f-d7d9-461c-ab73-7cda8c133c4d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.707515 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v" (OuterVolumeSpecName: "kube-api-access-22c7v") pod "797e9c0f-d7d9-461c-ab73-7cda8c133c4d" (UID: "797e9c0f-d7d9-461c-ab73-7cda8c133c4d"). InnerVolumeSpecName "kube-api-access-22c7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.796596 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.796636 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22c7v\" (UniqueName: \"kubernetes.io/projected/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-kube-api-access-22c7v\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.796684 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/797e9c0f-d7d9-461c-ab73-7cda8c133c4d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:31 crc kubenswrapper[4826]: I0319 19:00:31.985277 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.242035 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff" Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.242856 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff" event={"ID":"797e9c0f-d7d9-461c-ab73-7cda8c133c4d","Type":"ContainerDied","Data":"9735ad9c0a456ecea4e26e5847e5a3bc17e3202a0697e070257e12dd528e6c45"} Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.242896 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9735ad9c0a456ecea4e26e5847e5a3bc17e3202a0697e070257e12dd528e6c45" Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.614586 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-w8btv" Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.726332 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xdlc\" (UniqueName: \"kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc\") pod \"c6b4c82a-0ec5-412b-96c9-f61bc323c223\" (UID: \"c6b4c82a-0ec5-412b-96c9-f61bc323c223\") " Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.732694 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc" (OuterVolumeSpecName: "kube-api-access-2xdlc") pod "c6b4c82a-0ec5-412b-96c9-f61bc323c223" (UID: "c6b4c82a-0ec5-412b-96c9-f61bc323c223"). InnerVolumeSpecName "kube-api-access-2xdlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:32 crc kubenswrapper[4826]: I0319 19:00:32.828811 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xdlc\" (UniqueName: \"kubernetes.io/projected/c6b4c82a-0ec5-412b-96c9-f61bc323c223-kube-api-access-2xdlc\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.253243 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565780-w8btv" event={"ID":"c6b4c82a-0ec5-412b-96c9-f61bc323c223","Type":"ContainerDied","Data":"1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f"} Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.253300 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d62d3704177f444604d5eb6f3f6768fb30c7f5d703006ad1691b874994ccb5f" Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.254333 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565780-w8btv" Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.903139 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.903832 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" podUID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" containerName="controller-manager" containerID="cri-o://fd88dbfbae7a3b7e1aa5860fbfc5ca58b40879e9050d0f513a269557ae1f7813" gracePeriod=30 Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.909534 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 19:00:33 crc kubenswrapper[4826]: I0319 19:00:33.909931 4826 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" podUID="59640edf-3068-4993-8b50-c049106935ee" containerName="route-controller-manager" containerID="cri-o://cb3568bc85dcd89368e79d7fcdd3fef2bd4c8973e88249e6a850d7bf0efa0dbe" gracePeriod=30 Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.264982 4826 generic.go:334] "Generic (PLEG): container finished" podID="59640edf-3068-4993-8b50-c049106935ee" containerID="cb3568bc85dcd89368e79d7fcdd3fef2bd4c8973e88249e6a850d7bf0efa0dbe" exitCode=0 Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.265214 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" event={"ID":"59640edf-3068-4993-8b50-c049106935ee","Type":"ContainerDied","Data":"cb3568bc85dcd89368e79d7fcdd3fef2bd4c8973e88249e6a850d7bf0efa0dbe"} Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.267752 4826 generic.go:334] "Generic (PLEG): container finished" podID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" containerID="fd88dbfbae7a3b7e1aa5860fbfc5ca58b40879e9050d0f513a269557ae1f7813" exitCode=0 Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.267794 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" event={"ID":"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8","Type":"ContainerDied","Data":"fd88dbfbae7a3b7e1aa5860fbfc5ca58b40879e9050d0f513a269557ae1f7813"} Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.378671 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.382676 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453582 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config\") pod \"59640edf-3068-4993-8b50-c049106935ee\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453619 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln2sk\" (UniqueName: \"kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk\") pod \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453646 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca\") pod \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453724 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles\") pod \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453749 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config\") pod \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453772 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert\") pod \"59640edf-3068-4993-8b50-c049106935ee\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453850 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca\") pod \"59640edf-3068-4993-8b50-c049106935ee\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453884 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert\") pod \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\" (UID: \"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.453920 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slnn9\" (UniqueName: \"kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9\") pod \"59640edf-3068-4993-8b50-c049106935ee\" (UID: \"59640edf-3068-4993-8b50-c049106935ee\") " Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.454454 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config" (OuterVolumeSpecName: "config") pod "59640edf-3068-4993-8b50-c049106935ee" (UID: "59640edf-3068-4993-8b50-c049106935ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.454614 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" (UID: "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.454777 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" (UID: "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.454804 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "59640edf-3068-4993-8b50-c049106935ee" (UID: "59640edf-3068-4993-8b50-c049106935ee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.455788 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config" (OuterVolumeSpecName: "config") pod "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" (UID: "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.461811 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" (UID: "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.461837 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk" (OuterVolumeSpecName: "kube-api-access-ln2sk") pod "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" (UID: "ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8"). InnerVolumeSpecName "kube-api-access-ln2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.461873 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9" (OuterVolumeSpecName: "kube-api-access-slnn9") pod "59640edf-3068-4993-8b50-c049106935ee" (UID: "59640edf-3068-4993-8b50-c049106935ee"). InnerVolumeSpecName "kube-api-access-slnn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.461872 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59640edf-3068-4993-8b50-c049106935ee" (UID: "59640edf-3068-4993-8b50-c049106935ee"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555026 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555056 4826 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555067 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555076 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59640edf-3068-4993-8b50-c049106935ee-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555085 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555093 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555101 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slnn9\" (UniqueName: \"kubernetes.io/projected/59640edf-3068-4993-8b50-c049106935ee-kube-api-access-slnn9\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555111 4826 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ln2sk\" (UniqueName: \"kubernetes.io/projected/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8-kube-api-access-ln2sk\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:34 crc kubenswrapper[4826]: I0319 19:00:34.555119 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59640edf-3068-4993-8b50-c049106935ee-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110054 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:35 crc kubenswrapper[4826]: E0319 19:00:35.110484 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110515 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:35 crc kubenswrapper[4826]: E0319 19:00:35.110536 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b4c82a-0ec5-412b-96c9-f61bc323c223" containerName="oc" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110551 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b4c82a-0ec5-412b-96c9-f61bc323c223" containerName="oc" Mar 19 19:00:35 crc kubenswrapper[4826]: E0319 19:00:35.110575 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" containerName="controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110589 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" containerName="controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: E0319 19:00:35.110612 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797e9c0f-d7d9-461c-ab73-7cda8c133c4d" containerName="collect-profiles" 
Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110625 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="797e9c0f-d7d9-461c-ab73-7cda8c133c4d" containerName="collect-profiles" Mar 19 19:00:35 crc kubenswrapper[4826]: E0319 19:00:35.110677 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59640edf-3068-4993-8b50-c049106935ee" containerName="route-controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110692 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="59640edf-3068-4993-8b50-c049106935ee" containerName="route-controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110886 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="59640edf-3068-4993-8b50-c049106935ee" containerName="route-controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110909 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="797e9c0f-d7d9-461c-ab73-7cda8c133c4d" containerName="collect-profiles" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110929 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b4c82a-0ec5-412b-96c9-f61bc323c223" containerName="oc" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110947 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" containerName="controller-manager" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.110963 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.111554 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.116413 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-567cb464d6-bm4t6"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.117451 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.122696 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163138 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163200 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-proxy-ca-bundles\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163246 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " 
pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163284 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-config\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163319 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-client-ca\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163385 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5px\" (UniqueName: \"kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163436 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163469 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxppq\" (UniqueName: \"kubernetes.io/projected/e5996d80-d5eb-423c-8965-1f5704c3dd69-kube-api-access-xxppq\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.163584 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5996d80-d5eb-423c-8965-1f5704c3dd69-serving-cert\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.184450 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-567cb464d6-bm4t6"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.263977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5px\" (UniqueName: \"kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264026 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264044 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xxppq\" (UniqueName: \"kubernetes.io/projected/e5996d80-d5eb-423c-8965-1f5704c3dd69-kube-api-access-xxppq\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264070 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5996d80-d5eb-423c-8965-1f5704c3dd69-serving-cert\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264099 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264119 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-proxy-ca-bundles\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264140 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " 
pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264157 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-config\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.264172 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-client-ca\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.265926 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-proxy-ca-bundles\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.266605 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-client-ca\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.266730 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config\") pod 
\"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.266907 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5996d80-d5eb-423c-8965-1f5704c3dd69-config\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.268077 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.270394 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.273251 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5996d80-d5eb-423c-8965-1f5704c3dd69-serving-cert\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.280758 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.280876 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxppq\" (UniqueName: \"kubernetes.io/projected/e5996d80-d5eb-423c-8965-1f5704c3dd69-kube-api-access-xxppq\") pod \"controller-manager-567cb464d6-bm4t6\" (UID: \"e5996d80-d5eb-423c-8965-1f5704c3dd69\") " pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.280936 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9" event={"ID":"ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8","Type":"ContainerDied","Data":"ef5861fb683048f27977415b22572739fd26519dc69a703979a87cf419c3fe5a"} Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.281010 4826 scope.go:117] "RemoveContainer" containerID="fd88dbfbae7a3b7e1aa5860fbfc5ca58b40879e9050d0f513a269557ae1f7813" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.284310 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" event={"ID":"59640edf-3068-4993-8b50-c049106935ee","Type":"ContainerDied","Data":"fa550695db4b8560ef6ed765ed33de6a28714c3730a634164f6bc332002252dc"} Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.285950 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.295344 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5px\" (UniqueName: \"kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px\") pod \"route-controller-manager-bb4bb89f7-9sg9f\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.330875 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.333144 4826 scope.go:117] "RemoveContainer" containerID="cb3568bc85dcd89368e79d7fcdd3fef2bd4c8973e88249e6a850d7bf0efa0dbe" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.347030 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ffd9cb86-bg57h"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.354722 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.360800 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64c9cf5fcb-kq7k9"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.475588 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.484288 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:35 crc kubenswrapper[4826]: W0319 19:00:35.777157 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c3b10a6_1abc_42b9_8602_916ecefb6c0b.slice/crio-8e77f7c8032f61e36f19665d0947d80e067414126950de8a7c3cff03ab991573 WatchSource:0}: Error finding container 8e77f7c8032f61e36f19665d0947d80e067414126950de8a7c3cff03ab991573: Status 404 returned error can't find the container with id 8e77f7c8032f61e36f19665d0947d80e067414126950de8a7c3cff03ab991573 Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.778910 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.935403 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-567cb464d6-bm4t6"] Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.984288 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59640edf-3068-4993-8b50-c049106935ee" path="/var/lib/kubelet/pods/59640edf-3068-4993-8b50-c049106935ee/volumes" Mar 19 19:00:35 crc kubenswrapper[4826]: I0319 19:00:35.984983 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8" path="/var/lib/kubelet/pods/ace0dc84-db80-4c3d-bbbf-d49f3f3e79d8/volumes" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.290185 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" event={"ID":"e5996d80-d5eb-423c-8965-1f5704c3dd69","Type":"ContainerStarted","Data":"79bd64d7d2c89268df3c32958214a5b3ac6e94666db0ac84e1fea7d21c755d03"} Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.290245 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" event={"ID":"e5996d80-d5eb-423c-8965-1f5704c3dd69","Type":"ContainerStarted","Data":"29593e2d22c0ee2a60ad28bdc2b84ae4bdb7d5a5ed49a452644003a96795a25f"} Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.290357 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.291625 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" event={"ID":"8c3b10a6-1abc-42b9-8602-916ecefb6c0b","Type":"ContainerStarted","Data":"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0"} Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.291648 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" event={"ID":"8c3b10a6-1abc-42b9-8602-916ecefb6c0b","Type":"ContainerStarted","Data":"8e77f7c8032f61e36f19665d0947d80e067414126950de8a7c3cff03ab991573"} Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.291824 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.297820 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.309590 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podStartSLOduration=3.309575476 podStartE2EDuration="3.309575476s" podCreationTimestamp="2026-03-19 19:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:00:36.307442475 +0000 UTC m=+261.061510788" watchObservedRunningTime="2026-03-19 19:00:36.309575476 +0000 UTC m=+261.063643779" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.316715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 19:00:36 crc kubenswrapper[4826]: I0319 19:00:36.331025 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" podStartSLOduration=3.331008492 podStartE2EDuration="3.331008492s" podCreationTimestamp="2026-03-19 19:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:36.327573014 +0000 UTC m=+261.081641317" watchObservedRunningTime="2026-03-19 19:00:36.331008492 +0000 UTC m=+261.085076805" Mar 19 19:00:46 crc kubenswrapper[4826]: I0319 19:00:46.753574 4826 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 19:00:50 crc kubenswrapper[4826]: I0319 19:00:50.002975 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"] Mar 19 19:00:53 crc kubenswrapper[4826]: I0319 19:00:53.876960 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:53 crc kubenswrapper[4826]: I0319 19:00:53.877860 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" podUID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" containerName="route-controller-manager" containerID="cri-o://e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0" gracePeriod=30 Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.393613 4826 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.416195 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert\") pod \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.416287 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca\") pod \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.416399 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h5px\" (UniqueName: \"kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px\") pod \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.416509 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config\") pod \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\" (UID: \"8c3b10a6-1abc-42b9-8602-916ecefb6c0b\") " Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.417193 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "8c3b10a6-1abc-42b9-8602-916ecefb6c0b" (UID: "8c3b10a6-1abc-42b9-8602-916ecefb6c0b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.417828 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config" (OuterVolumeSpecName: "config") pod "8c3b10a6-1abc-42b9-8602-916ecefb6c0b" (UID: "8c3b10a6-1abc-42b9-8602-916ecefb6c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.419127 4826 generic.go:334] "Generic (PLEG): container finished" podID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" containerID="e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0" exitCode=0 Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.419194 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" event={"ID":"8c3b10a6-1abc-42b9-8602-916ecefb6c0b","Type":"ContainerDied","Data":"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0"} Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.419248 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" event={"ID":"8c3b10a6-1abc-42b9-8602-916ecefb6c0b","Type":"ContainerDied","Data":"8e77f7c8032f61e36f19665d0947d80e067414126950de8a7c3cff03ab991573"} Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.419301 4826 scope.go:117] "RemoveContainer" containerID="e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.419479 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.429502 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px" (OuterVolumeSpecName: "kube-api-access-7h5px") pod "8c3b10a6-1abc-42b9-8602-916ecefb6c0b" (UID: "8c3b10a6-1abc-42b9-8602-916ecefb6c0b"). InnerVolumeSpecName "kube-api-access-7h5px". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.453881 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8c3b10a6-1abc-42b9-8602-916ecefb6c0b" (UID: "8c3b10a6-1abc-42b9-8602-916ecefb6c0b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.469579 4826 scope.go:117] "RemoveContainer" containerID="e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0" Mar 19 19:00:54 crc kubenswrapper[4826]: E0319 19:00:54.473137 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0\": container with ID starting with e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0 not found: ID does not exist" containerID="e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.473194 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0"} err="failed to get container status \"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0\": rpc error: code = 
NotFound desc = could not find container \"e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0\": container with ID starting with e5986e47ee6f9cc99f24b00ef66495a83bd923b087b16f2dc621ea1cc56c90c0 not found: ID does not exist" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.518515 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h5px\" (UniqueName: \"kubernetes.io/projected/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-kube-api-access-7h5px\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.518560 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.518581 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.518599 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b10a6-1abc-42b9-8602-916ecefb6c0b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.758716 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:54 crc kubenswrapper[4826]: I0319 19:00:54.763192 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-9sg9f"] Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.117480 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:00:55 crc kubenswrapper[4826]: E0319 19:00:55.117793 4826 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" containerName="route-controller-manager" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.117813 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" containerName="route-controller-manager" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.117990 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" containerName="route-controller-manager" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.118529 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.123278 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.123468 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.123278 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.124114 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.126700 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.128857 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.184978 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.226622 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmvb\" (UniqueName: \"kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.226843 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.227259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.227381 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.328420 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.328560 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.328647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klmvb\" (UniqueName: \"kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.328781 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.330155 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.331405 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.334878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.359244 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmvb\" (UniqueName: \"kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb\") pod \"route-controller-manager-6c7f6f4957-zk6dz\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.400817 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.400907 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.401027 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.401931 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.402082 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80" gracePeriod=600 Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.486900 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.960828 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:00:55 crc kubenswrapper[4826]: W0319 19:00:55.971559 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8a877f_8bc5_4c4b_83ca_d42488a5efd2.slice/crio-4a247967414213767ae12eaefa14042f9c015fa6c6528079242b38f2c8de89fb WatchSource:0}: Error finding container 4a247967414213767ae12eaefa14042f9c015fa6c6528079242b38f2c8de89fb: Status 404 returned error can't find the container with id 4a247967414213767ae12eaefa14042f9c015fa6c6528079242b38f2c8de89fb Mar 19 19:00:55 crc kubenswrapper[4826]: I0319 19:00:55.984489 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3b10a6-1abc-42b9-8602-916ecefb6c0b" path="/var/lib/kubelet/pods/8c3b10a6-1abc-42b9-8602-916ecefb6c0b/volumes" Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.437155 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" event={"ID":"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2","Type":"ContainerStarted","Data":"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9"} Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.437559 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.437583 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" 
event={"ID":"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2","Type":"ContainerStarted","Data":"4a247967414213767ae12eaefa14042f9c015fa6c6528079242b38f2c8de89fb"} Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.439730 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80" exitCode=0 Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.439780 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80"} Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.439817 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911"} Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.468933 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" podStartSLOduration=3.468904088 podStartE2EDuration="3.468904088s" podCreationTimestamp="2026-03-19 19:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:00:56.461459085 +0000 UTC m=+281.215527448" watchObservedRunningTime="2026-03-19 19:00:56.468904088 +0000 UTC m=+281.222972411" Mar 19 19:00:56 crc kubenswrapper[4826]: I0319 19:00:56.528761 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.787185 4826 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77cc2"] Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.788402 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.798838 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77cc2"] Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922116 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-bound-sa-token\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922166 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a0e24a-07c2-4184-8f80-90479e82f839-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922207 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922243 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-trusted-ca\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922280 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-registry-certificates\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922316 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a0e24a-07c2-4184-8f80-90479e82f839-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922343 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlzc\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-kube-api-access-rmlzc\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.922382 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-registry-tls\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 
19:00:59 crc kubenswrapper[4826]: I0319 19:00:59.952955 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024291 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-bound-sa-token\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024358 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a0e24a-07c2-4184-8f80-90479e82f839-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024433 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-trusted-ca\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024487 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-registry-certificates\") pod \"image-registry-66df7c8f76-77cc2\" (UID: 
\"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024540 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a0e24a-07c2-4184-8f80-90479e82f839-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024575 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlzc\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-kube-api-access-rmlzc\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.024615 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-registry-tls\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.025241 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85a0e24a-07c2-4184-8f80-90479e82f839-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.025837 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-trusted-ca\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.027388 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85a0e24a-07c2-4184-8f80-90479e82f839-registry-certificates\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.031084 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85a0e24a-07c2-4184-8f80-90479e82f839-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.039595 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-registry-tls\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.041306 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-bound-sa-token\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.042357 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rmlzc\" (UniqueName: \"kubernetes.io/projected/85a0e24a-07c2-4184-8f80-90479e82f839-kube-api-access-rmlzc\") pod \"image-registry-66df7c8f76-77cc2\" (UID: \"85a0e24a-07c2-4184-8f80-90479e82f839\") " pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.106603 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:00 crc kubenswrapper[4826]: I0319 19:01:00.608108 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77cc2"] Mar 19 19:01:00 crc kubenswrapper[4826]: W0319 19:01:00.615450 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85a0e24a_07c2_4184_8f80_90479e82f839.slice/crio-bcce914f29562a72f409bfffb0eeadf07690a38b52eb3ba70e21cd71cb44087c WatchSource:0}: Error finding container bcce914f29562a72f409bfffb0eeadf07690a38b52eb3ba70e21cd71cb44087c: Status 404 returned error can't find the container with id bcce914f29562a72f409bfffb0eeadf07690a38b52eb3ba70e21cd71cb44087c Mar 19 19:01:01 crc kubenswrapper[4826]: I0319 19:01:01.473700 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" event={"ID":"85a0e24a-07c2-4184-8f80-90479e82f839","Type":"ContainerStarted","Data":"6b2995114511d3710a9b1f66f5d6a156142de9b257a4fe60eeca2f7444f6444f"} Mar 19 19:01:01 crc kubenswrapper[4826]: I0319 19:01:01.474091 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" Mar 19 19:01:01 crc kubenswrapper[4826]: I0319 19:01:01.474108 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" 
event={"ID":"85a0e24a-07c2-4184-8f80-90479e82f839","Type":"ContainerStarted","Data":"bcce914f29562a72f409bfffb0eeadf07690a38b52eb3ba70e21cd71cb44087c"} Mar 19 19:01:01 crc kubenswrapper[4826]: I0319 19:01:01.498842 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" podStartSLOduration=2.49882157 podStartE2EDuration="2.49882157s" podCreationTimestamp="2026-03-19 19:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:01.498000846 +0000 UTC m=+286.252069169" watchObservedRunningTime="2026-03-19 19:01:01.49882157 +0000 UTC m=+286.252889903" Mar 19 19:01:13 crc kubenswrapper[4826]: I0319 19:01:13.898166 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:01:13 crc kubenswrapper[4826]: I0319 19:01:13.899203 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" podUID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" containerName="route-controller-manager" containerID="cri-o://82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9" gracePeriod=30 Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.353629 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.433336 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca\") pod \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.433547 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klmvb\" (UniqueName: \"kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb\") pod \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.433727 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config\") pod \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.433783 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert\") pod \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\" (UID: \"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2\") " Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.434802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" (UID: "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.437272 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config" (OuterVolumeSpecName: "config") pod "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" (UID: "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.437401 4826 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.446220 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" (UID: "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.448801 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb" (OuterVolumeSpecName: "kube-api-access-klmvb") pod "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" (UID: "1c8a877f-8bc5-4c4b-83ca-d42488a5efd2"). InnerVolumeSpecName "kube-api-access-klmvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.539003 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.539034 4826 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.539047 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klmvb\" (UniqueName: \"kubernetes.io/projected/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2-kube-api-access-klmvb\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.557076 4826 generic.go:334] "Generic (PLEG): container finished" podID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" containerID="82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9" exitCode=0 Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.557147 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" event={"ID":"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2","Type":"ContainerDied","Data":"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9"} Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.557209 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" event={"ID":"1c8a877f-8bc5-4c4b-83ca-d42488a5efd2","Type":"ContainerDied","Data":"4a247967414213767ae12eaefa14042f9c015fa6c6528079242b38f2c8de89fb"} Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.557273 4826 scope.go:117] "RemoveContainer" containerID="82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9" Mar 19 
19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.557539 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.580459 4826 scope.go:117] "RemoveContainer" containerID="82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9" Mar 19 19:01:14 crc kubenswrapper[4826]: E0319 19:01:14.582835 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9\": container with ID starting with 82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9 not found: ID does not exist" containerID="82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.582870 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9"} err="failed to get container status \"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9\": rpc error: code = NotFound desc = could not find container \"82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9\": container with ID starting with 82aff541ffd0ba9843885369f016589125ab760601a66da579150cac8a580be9 not found: ID does not exist" Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.596459 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:01:14 crc kubenswrapper[4826]: I0319 19:01:14.602888 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c7f6f4957-zk6dz"] Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.032930 4826 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" containerName="oauth-openshift" containerID="cri-o://74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d" gracePeriod=15 Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.131078 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"] Mar 19 19:01:15 crc kubenswrapper[4826]: E0319 19:01:15.140987 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" containerName="route-controller-manager" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.141036 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" containerName="route-controller-manager" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.141290 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" containerName="route-controller-manager" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.141975 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.147871 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.148298 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.148735 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.149234 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.149838 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.150244 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.155971 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"] Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.249345 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-config\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.249424 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjzt\" (UniqueName: \"kubernetes.io/projected/5f25fb62-ec83-409e-88fb-0073d07869b9-kube-api-access-lmjzt\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.249599 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-client-ca\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.249924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f25fb62-ec83-409e-88fb-0073d07869b9-serving-cert\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.350929 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-config\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.351009 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjzt\" (UniqueName: \"kubernetes.io/projected/5f25fb62-ec83-409e-88fb-0073d07869b9-kube-api-access-lmjzt\") pod 
\"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.351107 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-client-ca\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.351255 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f25fb62-ec83-409e-88fb-0073d07869b9-serving-cert\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.352488 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-config\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.352604 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f25fb62-ec83-409e-88fb-0073d07869b9-client-ca\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.357329 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f25fb62-ec83-409e-88fb-0073d07869b9-serving-cert\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.381262 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjzt\" (UniqueName: \"kubernetes.io/projected/5f25fb62-ec83-409e-88fb-0073d07869b9-kube-api-access-lmjzt\") pod \"route-controller-manager-bb4bb89f7-bhb8x\" (UID: \"5f25fb62-ec83-409e-88fb-0073d07869b9\") " pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.488633 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.526505 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555099 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555173 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555243 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7wwk\" (UniqueName: \"kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555302 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555326 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs\") pod 
\"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555354 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555390 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555434 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555465 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555489 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 
19:01:15.555519 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555551 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555580 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.555605 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca\") pod \"7b5ed6bf-c032-4782-86eb-4803da62cb59\" (UID: \"7b5ed6bf-c032-4782-86eb-4803da62cb59\") " Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.556950 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.556976 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.557299 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.557474 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.560924 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk" (OuterVolumeSpecName: "kube-api-access-n7wwk") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "kube-api-access-n7wwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.561356 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.561497 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.565016 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.565856 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.566063 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.567359 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.569599 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.570274 4826 generic.go:334] "Generic (PLEG): container finished" podID="7b5ed6bf-c032-4782-86eb-4803da62cb59" containerID="74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d" exitCode=0
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.570330 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" event={"ID":"7b5ed6bf-c032-4782-86eb-4803da62cb59","Type":"ContainerDied","Data":"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"}
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.570351 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj"
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.570368 4826 scope.go:117] "RemoveContainer" containerID="74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.570357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4t5nj" event={"ID":"7b5ed6bf-c032-4782-86eb-4803da62cb59","Type":"ContainerDied","Data":"2658ee96c6ef5f5d81200c5cd8ac3ab3646124091534ae879d3aa094b5ef4801"}
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.576557 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.580649 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7b5ed6bf-c032-4782-86eb-4803da62cb59" (UID: "7b5ed6bf-c032-4782-86eb-4803da62cb59"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.622517 4826 scope.go:117] "RemoveContainer" containerID="74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"
Mar 19 19:01:15 crc kubenswrapper[4826]: E0319 19:01:15.623192 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d\": container with ID starting with 74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d not found: ID does not exist" containerID="74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.623251 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d"} err="failed to get container status \"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d\": rpc error: code = NotFound desc = could not find container \"74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d\": container with ID starting with 74631407175b2a500e4c0f3c7e68c1cc4fb09d80bf5f57c3acbef11c960f217d not found: ID does not exist"
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.656882 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.656931 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.656942 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.656955 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.656968 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657011 4826 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657022 4826 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5ed6bf-c032-4782-86eb-4803da62cb59-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657133 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657146 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657158 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657169 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657182 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657196 4826 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b5ed6bf-c032-4782-86eb-4803da62cb59-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.657207 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7wwk\" (UniqueName: \"kubernetes.io/projected/7b5ed6bf-c032-4782-86eb-4803da62cb59-kube-api-access-n7wwk\") on node \"crc\" DevicePath \"\""
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.931827 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"]
Mar 19 19:01:15 crc kubenswrapper[4826]: W0319 19:01:15.939673 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f25fb62_ec83_409e_88fb_0073d07869b9.slice/crio-c17fa94a954ab66142c90520587ffe24115216e43e1bba4ee63d5c3545b39991 WatchSource:0}: Error finding container c17fa94a954ab66142c90520587ffe24115216e43e1bba4ee63d5c3545b39991: Status 404 returned error can't find the container with id c17fa94a954ab66142c90520587ffe24115216e43e1bba4ee63d5c3545b39991
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.940504 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4t5nj"]
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.948147 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"]
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.990835 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8a877f-8bc5-4c4b-83ca-d42488a5efd2" path="/var/lib/kubelet/pods/1c8a877f-8bc5-4c4b-83ca-d42488a5efd2/volumes"
Mar 19 19:01:15 crc kubenswrapper[4826]: I0319 19:01:15.992728 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" path="/var/lib/kubelet/pods/7b5ed6bf-c032-4782-86eb-4803da62cb59/volumes"
Mar 19 19:01:16 crc kubenswrapper[4826]: I0319 19:01:16.581459 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" event={"ID":"5f25fb62-ec83-409e-88fb-0073d07869b9","Type":"ContainerStarted","Data":"124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb"}
Mar 19 19:01:16 crc kubenswrapper[4826]: I0319 19:01:16.581851 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"
Mar 19 19:01:16 crc kubenswrapper[4826]: I0319 19:01:16.581867 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" event={"ID":"5f25fb62-ec83-409e-88fb-0073d07869b9","Type":"ContainerStarted","Data":"c17fa94a954ab66142c90520587ffe24115216e43e1bba4ee63d5c3545b39991"}
Mar 19 19:01:16 crc kubenswrapper[4826]: I0319 19:01:16.588245 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"
Mar 19 19:01:16 crc kubenswrapper[4826]: I0319 19:01:16.607002 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podStartSLOduration=3.606978702 podStartE2EDuration="3.606978702s" podCreationTimestamp="2026-03-19 19:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:16.60582597 +0000 UTC m=+301.359894313" watchObservedRunningTime="2026-03-19 19:01:16.606978702 +0000 UTC m=+301.361047025"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.138913 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"]
Mar 19 19:01:19 crc kubenswrapper[4826]: E0319 19:01:19.139875 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" containerName="oauth-openshift"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.139910 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" containerName="oauth-openshift"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.140169 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5ed6bf-c032-4782-86eb-4803da62cb59" containerName="oauth-openshift"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.141012 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.143191 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.145554 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.145822 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.145934 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.146282 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.148280 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.148634 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.148693 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.148634 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.149069 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.148744 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.154274 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.185710 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.185811 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"]
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.194196 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.198574 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.207577 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.207755 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-service-ca\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.207828 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4zjt\" (UniqueName: \"kubernetes.io/projected/46e578cd-3724-4abe-805c-554b384ed050-kube-api-access-w4zjt\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.207882 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.207982 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208034 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-router-certs\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208324 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-audit-policies\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208423 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e578cd-3724-4abe-805c-554b384ed050-audit-dir\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208550 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208637 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208815 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.208965 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-error\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.209072 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-session\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.209126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-login\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309723 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-login\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309829 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309885 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-service-ca\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4zjt\" (UniqueName: \"kubernetes.io/projected/46e578cd-3724-4abe-805c-554b384ed050-kube-api-access-w4zjt\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309962 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.309995 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310025 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-router-certs\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310063 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-audit-policies\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310115 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e578cd-3724-4abe-805c-554b384ed050-audit-dir\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310156 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310205 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310243 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310352 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-error\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310406 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-session\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.310867 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e578cd-3724-4abe-805c-554b384ed050-audit-dir\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.312075 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-cliconfig\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.312061 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-service-ca\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.312771 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-audit-policies\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.313234 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.317610 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.318680 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-session\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.318831 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.319985 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-serving-cert\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.320323 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-login\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.320323 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-router-certs\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.320336 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-user-template-error\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.321094 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e578cd-3724-4abe-805c-554b384ed050-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.340892 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4zjt\" (UniqueName: \"kubernetes.io/projected/46e578cd-3724-4abe-805c-554b384ed050-kube-api-access-w4zjt\") pod \"oauth-openshift-55bb4f975f-zpl6z\" (UID: \"46e578cd-3724-4abe-805c-554b384ed050\") " pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.480840 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:19 crc kubenswrapper[4826]: I0319 19:01:19.957793 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"]
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.114700 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2"
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.193249 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"]
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.614743 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" event={"ID":"46e578cd-3724-4abe-805c-554b384ed050","Type":"ContainerStarted","Data":"9bc32b51b5566b427ffa287240d9eb0613e8145bd9253dd2736092863a4a7221"}
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.615255 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" event={"ID":"46e578cd-3724-4abe-805c-554b384ed050","Type":"ContainerStarted","Data":"f72eaea1a966a080b3e34c4bb19d041c1394f6a64b7383cf53aa5ad5ddf170c9"}
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.618602 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.654551 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podStartSLOduration=16.654518593 podStartE2EDuration="16.654518593s" podCreationTimestamp="2026-03-19 19:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:01:20.64460783 +0000 UTC m=+305.398676193" watchObservedRunningTime="2026-03-19 19:01:20.654518593 +0000 UTC m=+305.408586946"
Mar 19 19:01:20 crc kubenswrapper[4826]: I0319 19:01:20.894287 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.246942 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" podUID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" containerName="registry" containerID="cri-o://16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632" gracePeriod=30
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.726436 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.788955 4826 generic.go:334] "Generic (PLEG): container finished" podID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" containerID="16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632" exitCode=0
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.789005 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" event={"ID":"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e","Type":"ContainerDied","Data":"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"}
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.789051 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm" event={"ID":"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e","Type":"ContainerDied","Data":"7063689c24369653482c0bd06c861d5cf00a046eaa975b22b6e2a91b623ef4e1"}
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.789078 4826 scope.go:117] "RemoveContainer" containerID="16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.789611 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mwhjm"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.808777 4826 scope.go:117] "RemoveContainer" containerID="16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"
Mar 19 19:01:45 crc kubenswrapper[4826]: E0319 19:01:45.809287 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632\": container with ID starting with 16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632 not found: ID does not exist" containerID="16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.809334 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632"} err="failed to get container status \"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632\": rpc error: code = NotFound desc = could not find container \"16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632\": container with ID starting with 16ae911a0fab6988359d706d8d95881dc9d87e72d919a9b6d045d59288b5f632 not found: ID does not exist"
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.899522 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") "
Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.899927 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: 
\"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900011 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900065 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900101 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vblpc\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900325 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900363 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.900403 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token\") pod \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\" (UID: \"4b7a5ec4-de52-4a0c-9e90-5a2835f6476e\") " Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.902249 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.902470 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.907409 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc" (OuterVolumeSpecName: "kube-api-access-vblpc") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "kube-api-access-vblpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.911898 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.912128 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.912213 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.915636 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:01:45 crc kubenswrapper[4826]: I0319 19:01:45.940734 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" (UID: "4b7a5ec4-de52-4a0c-9e90-5a2835f6476e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002038 4826 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002078 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002097 4826 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002115 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vblpc\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-kube-api-access-vblpc\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002132 4826 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002148 4826 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.002165 4826 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:46 crc 
kubenswrapper[4826]: I0319 19:01:46.119396 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"] Mar 19 19:01:46 crc kubenswrapper[4826]: I0319 19:01:46.135809 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mwhjm"] Mar 19 19:01:47 crc kubenswrapper[4826]: I0319 19:01:47.985095 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" path="/var/lib/kubelet/pods/4b7a5ec4-de52-4a0c-9e90-5a2835f6476e/volumes" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.132229 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.134511 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-psc5t" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="registry-server" containerID="cri-o://8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc" gracePeriod=30 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.152817 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.153075 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lwdqq" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="registry-server" containerID="cri-o://207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86" gracePeriod=30 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.171326 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.171575 4826 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" containerID="cri-o://7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef" gracePeriod=30 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.182640 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.182917 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fpvj" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="registry-server" containerID="cri-o://bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102" gracePeriod=30 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.194978 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66v8z"] Mar 19 19:01:59 crc kubenswrapper[4826]: E0319 19:01:59.195190 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" containerName="registry" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.195202 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" containerName="registry" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.195282 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7a5ec4-de52-4a0c-9e90-5a2835f6476e" containerName="registry" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.195637 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.203254 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.203595 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkslk" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="registry-server" containerID="cri-o://71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146" gracePeriod=30 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.222343 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66v8z"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.385012 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.385314 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.385340 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rb58\" (UniqueName: 
\"kubernetes.io/projected/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-kube-api-access-9rb58\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.486000 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.486053 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.486070 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rb58\" (UniqueName: \"kubernetes.io/projected/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-kube-api-access-9rb58\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.487732 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc 
kubenswrapper[4826]: I0319 19:01:59.502826 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.506129 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rb58\" (UniqueName: \"kubernetes.io/projected/f182fb72-66c7-4d5d-bccd-29a47b27f4c6-kube-api-access-9rb58\") pod \"marketplace-operator-79b997595-66v8z\" (UID: \"f182fb72-66c7-4d5d-bccd-29a47b27f4c6\") " pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.644314 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.647566 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.655130 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.659400 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.674193 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.676835 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.689249 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content\") pod \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.689304 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities\") pod \"0e0d7689-755d-4e24-a337-4177c37c2437\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.690938 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities" (OuterVolumeSpecName: "utilities") pod "0e0d7689-755d-4e24-a337-4177c37c2437" (UID: "0e0d7689-755d-4e24-a337-4177c37c2437"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.690989 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbbb\" (UniqueName: \"kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb\") pod \"7109581b-42ad-4e72-89be-ae269dcaea42\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.691058 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content\") pod \"0e0d7689-755d-4e24-a337-4177c37c2437\" (UID: \"0e0d7689-755d-4e24-a337-4177c37c2437\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.695301 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb" (OuterVolumeSpecName: "kube-api-access-vcbbb") pod "7109581b-42ad-4e72-89be-ae269dcaea42" (UID: "7109581b-42ad-4e72-89be-ae269dcaea42"). InnerVolumeSpecName "kube-api-access-vcbbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.698998 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68bp\" (UniqueName: \"kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp\") pod \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699041 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content\") pod \"7109581b-42ad-4e72-89be-ae269dcaea42\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699127 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities\") pod \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\" (UID: \"6397cca1-7284-4e40-9b7e-3f8026c72f5f\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699160 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zp8h\" (UniqueName: \"kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h\") pod \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699198 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content\") pod \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699227 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities\") pod \"7109581b-42ad-4e72-89be-ae269dcaea42\" (UID: \"7109581b-42ad-4e72-89be-ae269dcaea42\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699263 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics\") pod \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699337 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca\") pod \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699362 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities\") pod \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\" (UID: \"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699391 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsr27\" (UniqueName: \"kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27\") pod \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\" (UID: \"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.699415 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcngb\" (UniqueName: \"kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb\") pod \"0e0d7689-755d-4e24-a337-4177c37c2437\" (UID: 
\"0e0d7689-755d-4e24-a337-4177c37c2437\") " Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.700307 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities" (OuterVolumeSpecName: "utilities") pod "7109581b-42ad-4e72-89be-ae269dcaea42" (UID: "7109581b-42ad-4e72-89be-ae269dcaea42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.701615 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities" (OuterVolumeSpecName: "utilities") pod "6397cca1-7284-4e40-9b7e-3f8026c72f5f" (UID: "6397cca1-7284-4e40-9b7e-3f8026c72f5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.701705 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.701726 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.701739 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbbb\" (UniqueName: \"kubernetes.io/projected/7109581b-42ad-4e72-89be-ae269dcaea42-kube-api-access-vcbbb\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.703818 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod 
"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" (UID: "ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.706312 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities" (OuterVolumeSpecName: "utilities") pod "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" (UID: "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.707848 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp" (OuterVolumeSpecName: "kube-api-access-b68bp") pod "6397cca1-7284-4e40-9b7e-3f8026c72f5f" (UID: "6397cca1-7284-4e40-9b7e-3f8026c72f5f"). InnerVolumeSpecName "kube-api-access-b68bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.708060 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" (UID: "ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.710942 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h" (OuterVolumeSpecName: "kube-api-access-5zp8h") pod "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" (UID: "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c"). InnerVolumeSpecName "kube-api-access-5zp8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.712469 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27" (OuterVolumeSpecName: "kube-api-access-wsr27") pod "ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" (UID: "ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9"). InnerVolumeSpecName "kube-api-access-wsr27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.714689 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb" (OuterVolumeSpecName: "kube-api-access-zcngb") pod "0e0d7689-755d-4e24-a337-4177c37c2437" (UID: "0e0d7689-755d-4e24-a337-4177c37c2437"). InnerVolumeSpecName "kube-api-access-zcngb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.762538 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6397cca1-7284-4e40-9b7e-3f8026c72f5f" (UID: "6397cca1-7284-4e40-9b7e-3f8026c72f5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.764385 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e0d7689-755d-4e24-a337-4177c37c2437" (UID: "0e0d7689-755d-4e24-a337-4177c37c2437"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.779058 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" (UID: "ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803005 4826 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803316 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803329 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsr27\" (UniqueName: \"kubernetes.io/projected/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-kube-api-access-wsr27\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803338 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcngb\" (UniqueName: \"kubernetes.io/projected/0e0d7689-755d-4e24-a337-4177c37c2437-kube-api-access-zcngb\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803346 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803354 4826 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7689-755d-4e24-a337-4177c37c2437-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803363 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68bp\" (UniqueName: \"kubernetes.io/projected/6397cca1-7284-4e40-9b7e-3f8026c72f5f-kube-api-access-b68bp\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803371 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6397cca1-7284-4e40-9b7e-3f8026c72f5f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803379 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zp8h\" (UniqueName: \"kubernetes.io/projected/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-kube-api-access-5zp8h\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803386 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.803394 4826 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.889719 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7109581b-42ad-4e72-89be-ae269dcaea42" (UID: "7109581b-42ad-4e72-89be-ae269dcaea42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.899537 4826 generic.go:334] "Generic (PLEG): container finished" podID="0e0d7689-755d-4e24-a337-4177c37c2437" containerID="bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102" exitCode=0 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.899590 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerDied","Data":"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.899617 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fpvj" event={"ID":"0e0d7689-755d-4e24-a337-4177c37c2437","Type":"ContainerDied","Data":"b95365443890536067099892c17281b9ba29d7aa7272278995046563a6ae5121"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.899633 4826 scope.go:117] "RemoveContainer" containerID="bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.899753 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fpvj" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.905216 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7109581b-42ad-4e72-89be-ae269dcaea42-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.908537 4826 generic.go:334] "Generic (PLEG): container finished" podID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerID="7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef" exitCode=0 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.908725 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.908780 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" event={"ID":"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9","Type":"ContainerDied","Data":"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.908807 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-flj2h" event={"ID":"ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9","Type":"ContainerDied","Data":"d31b3b850c369f0d05e723c9102b3b9f24bb6a7e11c48499933d9534695a950c"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.919858 4826 generic.go:334] "Generic (PLEG): container finished" podID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerID="8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc" exitCode=0 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.919961 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerDied","Data":"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.919981 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-psc5t" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.919992 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-psc5t" event={"ID":"ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c","Type":"ContainerDied","Data":"be6fcb17cd5035bc0108081053febfd13705bdea7bba9efe0eb9505951f08038"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.924181 4826 generic.go:334] "Generic (PLEG): container finished" podID="7109581b-42ad-4e72-89be-ae269dcaea42" containerID="71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146" exitCode=0 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.924236 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerDied","Data":"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.924260 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkslk" event={"ID":"7109581b-42ad-4e72-89be-ae269dcaea42","Type":"ContainerDied","Data":"0bd6a7374fa5a9777c60eabca042d04ba570c647b7d7883c326f47d132019430"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.924316 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkslk" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.929037 4826 scope.go:117] "RemoveContainer" containerID="5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.933512 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.935197 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwdqq" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.936563 4826 generic.go:334] "Generic (PLEG): container finished" podID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerID="207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86" exitCode=0 Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.936614 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerDied","Data":"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.936644 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwdqq" event={"ID":"6397cca1-7284-4e40-9b7e-3f8026c72f5f","Type":"ContainerDied","Data":"538c7fed2191a231d74d752f0d7bd6c3b9fcdba88b9571b2b897b3defb394568"} Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.939359 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fpvj"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.956831 4826 scope.go:117] "RemoveContainer" containerID="7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.958725 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.965893 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-flj2h"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.973859 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.984681 4826 scope.go:117] "RemoveContainer" 
containerID="bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102" Mar 19 19:01:59 crc kubenswrapper[4826]: E0319 19:01:59.985494 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102\": container with ID starting with bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102 not found: ID does not exist" containerID="bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.985620 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102"} err="failed to get container status \"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102\": rpc error: code = NotFound desc = could not find container \"bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102\": container with ID starting with bfead77d20c83e15418079844cb872a8336a22b8dcb8f367ca67ff35b0cc0102 not found: ID does not exist" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.985688 4826 scope.go:117] "RemoveContainer" containerID="5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b" Mar 19 19:01:59 crc kubenswrapper[4826]: E0319 19:01:59.986253 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b\": container with ID starting with 5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b not found: ID does not exist" containerID="5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.986299 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b"} err="failed to get container status \"5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b\": rpc error: code = NotFound desc = could not find container \"5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b\": container with ID starting with 5569e1509803a7a675cd2dc0449d62b6c078fc25922ae818777be6d8239b723b not found: ID does not exist" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.986325 4826 scope.go:117] "RemoveContainer" containerID="7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b" Mar 19 19:01:59 crc kubenswrapper[4826]: E0319 19:01:59.987058 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b\": container with ID starting with 7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b not found: ID does not exist" containerID="7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.987308 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b"} err="failed to get container status \"7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b\": rpc error: code = NotFound desc = could not find container \"7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b\": container with ID starting with 7343345644b75d17b5be21256f8aa9cc9d3cdc32652ea8fdf64752900330e54b not found: ID does not exist" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.987345 4826 scope.go:117] "RemoveContainer" containerID="7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef" Mar 19 19:01:59 crc kubenswrapper[4826]: I0319 19:01:59.997428 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" path="/var/lib/kubelet/pods/0e0d7689-755d-4e24-a337-4177c37c2437/volumes" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:01:59.998887 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" path="/var/lib/kubelet/pods/ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9/volumes" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.001190 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-psc5t"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.001228 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.004854 4826 scope.go:117] "RemoveContainer" containerID="7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.005823 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef\": container with ID starting with 7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef not found: ID does not exist" containerID="7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.005856 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef"} err="failed to get container status \"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef\": rpc error: code = NotFound desc = could not find container \"7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef\": container with ID starting with 7df9b284d591f0263e0c1a5abc3c274f43472b3d37014e63cb605c3924259aef not found: ID does not exist" Mar 19 19:02:00 crc 
kubenswrapper[4826]: I0319 19:02:00.005877 4826 scope.go:117] "RemoveContainer" containerID="8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.010416 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkslk"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.015065 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.018485 4826 scope.go:117] "RemoveContainer" containerID="42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.020023 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lwdqq"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.039022 4826 scope.go:117] "RemoveContainer" containerID="2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.062026 4826 scope.go:117] "RemoveContainer" containerID="8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.062897 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc\": container with ID starting with 8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc not found: ID does not exist" containerID="8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.062929 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc"} err="failed to get container status 
\"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc\": rpc error: code = NotFound desc = could not find container \"8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc\": container with ID starting with 8c1ffe044a85cadc547ff459f20e7ddf8d2417059853707962e54c4ec31457cc not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.062950 4826 scope.go:117] "RemoveContainer" containerID="42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.064854 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398\": container with ID starting with 42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398 not found: ID does not exist" containerID="42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.064912 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398"} err="failed to get container status \"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398\": rpc error: code = NotFound desc = could not find container \"42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398\": container with ID starting with 42b4f9585c0139b86067ff1c754dcab464654a52fca57539b9d091f3022c2398 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.065165 4826 scope.go:117] "RemoveContainer" containerID="2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.066837 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267\": container with ID starting with 2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267 not found: ID does not exist" containerID="2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.066909 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267"} err="failed to get container status \"2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267\": rpc error: code = NotFound desc = could not find container \"2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267\": container with ID starting with 2441382ea1909fa40ab74a874916e31708b8e86f3926810690ee5b075d0d2267 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.066938 4826 scope.go:117] "RemoveContainer" containerID="71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.080732 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66v8z"] Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.082222 4826 scope.go:117] "RemoveContainer" containerID="c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.098996 4826 scope.go:117] "RemoveContainer" containerID="adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.128927 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565782-kmh7v"] Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.129379 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="extract-content" Mar 19 19:02:00 crc 
kubenswrapper[4826]: I0319 19:02:00.129480 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.129590 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.129738 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.129825 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.129890 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.129948 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.130012 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.130068 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.130391 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.130841 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" Mar 19 19:02:00 
crc kubenswrapper[4826]: I0319 19:02:00.130924 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131007 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131089 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131172 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131255 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131297 4826 scope.go:117] "RemoveContainer" containerID="71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131331 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131450 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="extract-utilities" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131502 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131513 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 
19:02:00.131527 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131535 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131548 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131582 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="extract-content" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.131602 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131610 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131876 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb5bca9-57f3-415c-b2c2-c5088fe6c5d9" containerName="marketplace-operator" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131901 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0d7689-755d-4e24-a337-4177c37c2437" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131913 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.131929 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" containerName="registry-server" Mar 19 19:02:00 crc 
kubenswrapper[4826]: I0319 19:02:00.131941 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" containerName="registry-server" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.132344 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.132418 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146\": container with ID starting with 71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146 not found: ID does not exist" containerID="71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.132465 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146"} err="failed to get container status \"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146\": rpc error: code = NotFound desc = could not find container \"71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146\": container with ID starting with 71b424507f0985e335f1f90139a9f35f490a1d4526b09d52f483274c93a36146 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.136328 4826 scope.go:117] "RemoveContainer" containerID="c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.136135 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-kmh7v"] Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.136841 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2\": container with ID starting with c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2 not found: ID does not exist" containerID="c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.137939 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2"} err="failed to get container status \"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2\": rpc error: code = NotFound desc = could not find container \"c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2\": container with ID starting with c44f6e6ddff34243570c2ba4a492ca1487ef44f2418bc1bf35e422d1cf526dc2 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.138073 4826 scope.go:117] "RemoveContainer" containerID="adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.138376 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.138842 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b\": container with ID starting with adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b not found: ID does not exist" containerID="adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.138896 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b"} err="failed to get container status 
\"adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b\": rpc error: code = NotFound desc = could not find container \"adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b\": container with ID starting with adc9487e2da585a7282579470adf2501541cde341a769d6d631593359b636b5b not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.138930 4826 scope.go:117] "RemoveContainer" containerID="207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.142003 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.143378 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.173208 4826 scope.go:117] "RemoveContainer" containerID="a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.195228 4826 scope.go:117] "RemoveContainer" containerID="01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.209924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4j6w\" (UniqueName: \"kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w\") pod \"auto-csr-approver-29565782-kmh7v\" (UID: \"0e93cc0d-3987-4256-b7cb-6796e66fd509\") " pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.226249 4826 scope.go:117] "RemoveContainer" containerID="207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.226806 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86\": container with ID starting with 207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86 not found: ID does not exist" containerID="207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.226841 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86"} err="failed to get container status \"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86\": rpc error: code = NotFound desc = could not find container \"207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86\": container with ID starting with 207e00c21fa248a25bcaa99664c76d352f723d37e7e7047e198e62df271bad86 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.226868 4826 scope.go:117] "RemoveContainer" containerID="a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.227083 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58\": container with ID starting with a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58 not found: ID does not exist" containerID="a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.227108 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58"} err="failed to get container status \"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58\": rpc error: code = NotFound desc = could not find container 
\"a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58\": container with ID starting with a754d01d13643fd564d9e7d8297d9632187cfd6f652e385b3a82c07a48b74f58 not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.227125 4826 scope.go:117] "RemoveContainer" containerID="01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb" Mar 19 19:02:00 crc kubenswrapper[4826]: E0319 19:02:00.227454 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb\": container with ID starting with 01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb not found: ID does not exist" containerID="01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.227478 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb"} err="failed to get container status \"01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb\": rpc error: code = NotFound desc = could not find container \"01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb\": container with ID starting with 01b9e4327f4b37897867f0466ff0f9cf9c675d9b152e4b94acd69fd58ef474cb not found: ID does not exist" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.311094 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4j6w\" (UniqueName: \"kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w\") pod \"auto-csr-approver-29565782-kmh7v\" (UID: \"0e93cc0d-3987-4256-b7cb-6796e66fd509\") " pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.335340 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s4j6w\" (UniqueName: \"kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w\") pod \"auto-csr-approver-29565782-kmh7v\" (UID: \"0e93cc0d-3987-4256-b7cb-6796e66fd509\") " pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.458909 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.914222 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-kmh7v"] Mar 19 19:02:00 crc kubenswrapper[4826]: W0319 19:02:00.916019 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e93cc0d_3987_4256_b7cb_6796e66fd509.slice/crio-44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd WatchSource:0}: Error finding container 44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd: Status 404 returned error can't find the container with id 44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.942941 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" event={"ID":"0e93cc0d-3987-4256-b7cb-6796e66fd509","Type":"ContainerStarted","Data":"44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd"} Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.945370 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" event={"ID":"f182fb72-66c7-4d5d-bccd-29a47b27f4c6","Type":"ContainerStarted","Data":"68307e5a08a95cfd72fde6061fc39632cfa52ca8644e891be3785baefa2852fd"} Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.945426 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" event={"ID":"f182fb72-66c7-4d5d-bccd-29a47b27f4c6","Type":"ContainerStarted","Data":"387c84227621292392953112cc933cebbd0273c98ad6e309b85e75f5e487261d"} Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.945673 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.951369 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 19:02:00 crc kubenswrapper[4826]: I0319 19:02:00.963346 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podStartSLOduration=1.9633335920000001 podStartE2EDuration="1.963333592s" podCreationTimestamp="2026-03-19 19:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:02:00.959706168 +0000 UTC m=+345.713774481" watchObservedRunningTime="2026-03-19 19:02:00.963333592 +0000 UTC m=+345.717401895" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.149010 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.149969 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.154414 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.163274 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.222467 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2nv\" (UniqueName: \"kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.222512 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.222563 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.323826 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2nv\" (UniqueName: \"kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv\") pod \"certified-operators-2v9rn\" 
(UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.323881 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.323931 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.324812 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.324823 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.352944 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2nv\" (UniqueName: \"kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv\") pod \"certified-operators-2v9rn\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " 
pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.488464 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.758336 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plvms"] Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.760244 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.762151 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.764882 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.768698 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plvms"] Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.830492 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-utilities\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.830528 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdz2\" (UniqueName: \"kubernetes.io/projected/623d1506-574a-4eee-9f1c-cc0ee85e9083-kube-api-access-gpdz2\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc 
kubenswrapper[4826]: I0319 19:02:01.830562 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-catalog-content\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.932310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-utilities\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.932617 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdz2\" (UniqueName: \"kubernetes.io/projected/623d1506-574a-4eee-9f1c-cc0ee85e9083-kube-api-access-gpdz2\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.932856 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-catalog-content\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.933574 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-utilities\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: 
I0319 19:02:01.934527 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623d1506-574a-4eee-9f1c-cc0ee85e9083-catalog-content\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.959618 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerStarted","Data":"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d"} Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.959670 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerStarted","Data":"11f2dd7d8e81892f718cd65b9e12b759fb3352a5f952ed556a8f6f53278f27c2"} Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.967869 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdz2\" (UniqueName: \"kubernetes.io/projected/623d1506-574a-4eee-9f1c-cc0ee85e9083-kube-api-access-gpdz2\") pod \"redhat-marketplace-plvms\" (UID: \"623d1506-574a-4eee-9f1c-cc0ee85e9083\") " pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.984359 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6397cca1-7284-4e40-9b7e-3f8026c72f5f" path="/var/lib/kubelet/pods/6397cca1-7284-4e40-9b7e-3f8026c72f5f/volumes" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.984991 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7109581b-42ad-4e72-89be-ae269dcaea42" path="/var/lib/kubelet/pods/7109581b-42ad-4e72-89be-ae269dcaea42/volumes" Mar 19 19:02:01 crc kubenswrapper[4826]: I0319 19:02:01.985625 4826 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c" path="/var/lib/kubelet/pods/ca0af29a-9ab6-4d5f-a6fd-bdb5f3d5526c/volumes" Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.117560 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.506727 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plvms"] Mar 19 19:02:02 crc kubenswrapper[4826]: W0319 19:02:02.523324 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623d1506_574a_4eee_9f1c_cc0ee85e9083.slice/crio-2ca621e671fff7fc73fd280f67bfde3279ab58b3b76f72e38c415958e2af1a66 WatchSource:0}: Error finding container 2ca621e671fff7fc73fd280f67bfde3279ab58b3b76f72e38c415958e2af1a66: Status 404 returned error can't find the container with id 2ca621e671fff7fc73fd280f67bfde3279ab58b3b76f72e38c415958e2af1a66 Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.968187 4826 generic.go:334] "Generic (PLEG): container finished" podID="4baa475c-f008-41cf-99d8-88723e43abf0" containerID="a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d" exitCode=0 Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.968285 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerDied","Data":"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d"} Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.972483 4826 generic.go:334] "Generic (PLEG): container finished" podID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerID="438a77aec2f0b6ee830e112a9c5922b4376e59d2fc46e31c3bf1ebca8d175db6" exitCode=0 Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.973128 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-plvms" event={"ID":"623d1506-574a-4eee-9f1c-cc0ee85e9083","Type":"ContainerDied","Data":"438a77aec2f0b6ee830e112a9c5922b4376e59d2fc46e31c3bf1ebca8d175db6"} Mar 19 19:02:02 crc kubenswrapper[4826]: I0319 19:02:02.973272 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plvms" event={"ID":"623d1506-574a-4eee-9f1c-cc0ee85e9083","Type":"ContainerStarted","Data":"2ca621e671fff7fc73fd280f67bfde3279ab58b3b76f72e38c415958e2af1a66"} Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.557103 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xnzmj"] Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.559719 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.565980 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.573412 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xnzmj"] Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.662045 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-catalog-content\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.662126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-utilities\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " 
pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.662153 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6rz\" (UniqueName: \"kubernetes.io/projected/326a5687-dfe7-4a01-b8b9-c6bedd76684a-kube-api-access-2m6rz\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.763152 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-catalog-content\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.763219 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-utilities\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.763248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6rz\" (UniqueName: \"kubernetes.io/projected/326a5687-dfe7-4a01-b8b9-c6bedd76684a-kube-api-access-2m6rz\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.763618 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-catalog-content\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " 
pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.763785 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/326a5687-dfe7-4a01-b8b9-c6bedd76684a-utilities\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.784624 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6rz\" (UniqueName: \"kubernetes.io/projected/326a5687-dfe7-4a01-b8b9-c6bedd76684a-kube-api-access-2m6rz\") pod \"redhat-operators-xnzmj\" (UID: \"326a5687-dfe7-4a01-b8b9-c6bedd76684a\") " pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.926502 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:03 crc kubenswrapper[4826]: I0319 19:02:03.991006 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plvms" event={"ID":"623d1506-574a-4eee-9f1c-cc0ee85e9083","Type":"ContainerStarted","Data":"cd861f571d066d3d01cc2e3ecd316553e96c323dcdcb7b7723c550bbbb82ec7c"} Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.161409 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bp86p"] Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.162550 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.165960 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.169822 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.169865 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.169895 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwxj\" (UniqueName: \"kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.171247 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bp86p"] Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.633219 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content\") pod \"community-operators-bp86p\" (UID: 
\"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.633291 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwxj\" (UniqueName: \"kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.633383 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.633878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.634140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content\") pod \"community-operators-bp86p\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.659025 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwxj\" (UniqueName: \"kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj\") pod \"community-operators-bp86p\" (UID: 
\"fdc7f0e3-fe71-4226-a701-40b602cf6444\") " pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.717241 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xnzmj"] Mar 19 19:02:04 crc kubenswrapper[4826]: W0319 19:02:04.721753 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod326a5687_dfe7_4a01_b8b9_c6bedd76684a.slice/crio-96263bc5f569e3207a74db39ab535a0d1984ac2417c340b82a87ffc67a5c4021 WatchSource:0}: Error finding container 96263bc5f569e3207a74db39ab535a0d1984ac2417c340b82a87ffc67a5c4021: Status 404 returned error can't find the container with id 96263bc5f569e3207a74db39ab535a0d1984ac2417c340b82a87ffc67a5c4021 Mar 19 19:02:04 crc kubenswrapper[4826]: I0319 19:02:04.805667 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.003635 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerStarted","Data":"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93"} Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.007064 4826 generic.go:334] "Generic (PLEG): container finished" podID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerID="cd861f571d066d3d01cc2e3ecd316553e96c323dcdcb7b7723c550bbbb82ec7c" exitCode=0 Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.007188 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plvms" event={"ID":"623d1506-574a-4eee-9f1c-cc0ee85e9083","Type":"ContainerDied","Data":"cd861f571d066d3d01cc2e3ecd316553e96c323dcdcb7b7723c550bbbb82ec7c"} Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.015257 4826 generic.go:334] "Generic 
(PLEG): container finished" podID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerID="976f313e8ec030f5d2002986fb336f041a4432107997479f3d63b44d63c73db0" exitCode=0 Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.015298 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnzmj" event={"ID":"326a5687-dfe7-4a01-b8b9-c6bedd76684a","Type":"ContainerDied","Data":"976f313e8ec030f5d2002986fb336f041a4432107997479f3d63b44d63c73db0"} Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.015325 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnzmj" event={"ID":"326a5687-dfe7-4a01-b8b9-c6bedd76684a","Type":"ContainerStarted","Data":"96263bc5f569e3207a74db39ab535a0d1984ac2417c340b82a87ffc67a5c4021"} Mar 19 19:02:05 crc kubenswrapper[4826]: I0319 19:02:05.778209 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bp86p"] Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.022087 4826 generic.go:334] "Generic (PLEG): container finished" podID="0e93cc0d-3987-4256-b7cb-6796e66fd509" containerID="c9547ac988e9316cc508204e8ce335d1d20f5b6a5ee44042e52aeb47ed62c566" exitCode=0 Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.022392 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" event={"ID":"0e93cc0d-3987-4256-b7cb-6796e66fd509","Type":"ContainerDied","Data":"c9547ac988e9316cc508204e8ce335d1d20f5b6a5ee44042e52aeb47ed62c566"} Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.025702 4826 generic.go:334] "Generic (PLEG): container finished" podID="4baa475c-f008-41cf-99d8-88723e43abf0" containerID="072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93" exitCode=0 Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.025768 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" 
event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerDied","Data":"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93"} Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.029182 4826 generic.go:334] "Generic (PLEG): container finished" podID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerID="724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd" exitCode=0 Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.029240 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerDied","Data":"724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd"} Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.029259 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerStarted","Data":"00bac2129e9629f7f6ed0424368567c43c29cbb9988d4f3306f00bc9124fdc8b"} Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.033015 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plvms" event={"ID":"623d1506-574a-4eee-9f1c-cc0ee85e9083","Type":"ContainerStarted","Data":"cf3e6f85c69fdda92a07610750a78876f458376585f88ad22ac8d63b77bc4af5"} Mar 19 19:02:06 crc kubenswrapper[4826]: I0319 19:02:06.101889 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plvms" podStartSLOduration=2.640275529 podStartE2EDuration="5.101872585s" podCreationTimestamp="2026-03-19 19:02:01 +0000 UTC" firstStartedPulling="2026-03-19 19:02:02.97546983 +0000 UTC m=+347.729538173" lastFinishedPulling="2026-03-19 19:02:05.437066886 +0000 UTC m=+350.191135229" observedRunningTime="2026-03-19 19:02:06.099811062 +0000 UTC m=+350.853879395" watchObservedRunningTime="2026-03-19 19:02:06.101872585 +0000 UTC 
m=+350.855940898" Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.050382 4826 generic.go:334] "Generic (PLEG): container finished" podID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerID="9bd5020ad8c6a35ab4142733028c9313cbcabf6f92ee8bd67a54666c54d9c08c" exitCode=0 Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.050428 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnzmj" event={"ID":"326a5687-dfe7-4a01-b8b9-c6bedd76684a","Type":"ContainerDied","Data":"9bd5020ad8c6a35ab4142733028c9313cbcabf6f92ee8bd67a54666c54d9c08c"} Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.057127 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerStarted","Data":"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739"} Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.291986 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.303083 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2v9rn" podStartSLOduration=2.766201579 podStartE2EDuration="6.303065247s" podCreationTimestamp="2026-03-19 19:02:01 +0000 UTC" firstStartedPulling="2026-03-19 19:02:02.970264315 +0000 UTC m=+347.724332658" lastFinishedPulling="2026-03-19 19:02:06.507127973 +0000 UTC m=+351.261196326" observedRunningTime="2026-03-19 19:02:07.095409298 +0000 UTC m=+351.849477611" watchObservedRunningTime="2026-03-19 19:02:07.303065247 +0000 UTC m=+352.057133560" Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.470545 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4j6w\" (UniqueName: \"kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w\") pod \"0e93cc0d-3987-4256-b7cb-6796e66fd509\" (UID: \"0e93cc0d-3987-4256-b7cb-6796e66fd509\") " Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.480326 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w" (OuterVolumeSpecName: "kube-api-access-s4j6w") pod "0e93cc0d-3987-4256-b7cb-6796e66fd509" (UID: "0e93cc0d-3987-4256-b7cb-6796e66fd509"). InnerVolumeSpecName "kube-api-access-s4j6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:02:07 crc kubenswrapper[4826]: I0319 19:02:07.572245 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4j6w\" (UniqueName: \"kubernetes.io/projected/0e93cc0d-3987-4256-b7cb-6796e66fd509-kube-api-access-s4j6w\") on node \"crc\" DevicePath \"\"" Mar 19 19:02:08 crc kubenswrapper[4826]: I0319 19:02:08.065246 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" event={"ID":"0e93cc0d-3987-4256-b7cb-6796e66fd509","Type":"ContainerDied","Data":"44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd"} Mar 19 19:02:08 crc kubenswrapper[4826]: I0319 19:02:08.065305 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c6ae027e3797b2038571c273f9e513858c203ef9eba209a2a8f74d21f74dbd" Mar 19 19:02:08 crc kubenswrapper[4826]: I0319 19:02:08.065385 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565782-kmh7v" Mar 19 19:02:08 crc kubenswrapper[4826]: I0319 19:02:08.070787 4826 generic.go:334] "Generic (PLEG): container finished" podID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerID="9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce" exitCode=0 Mar 19 19:02:08 crc kubenswrapper[4826]: I0319 19:02:08.070868 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerDied","Data":"9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce"} Mar 19 19:02:09 crc kubenswrapper[4826]: I0319 19:02:09.077701 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerStarted","Data":"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47"} Mar 19 19:02:09 crc 
kubenswrapper[4826]: I0319 19:02:09.080058 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnzmj" event={"ID":"326a5687-dfe7-4a01-b8b9-c6bedd76684a","Type":"ContainerStarted","Data":"8f1cd4cbb21cf3c07afddfa69e78270b63fad57082654be50152679acfe34b36"} Mar 19 19:02:09 crc kubenswrapper[4826]: I0319 19:02:09.101774 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bp86p" podStartSLOduration=2.193507893 podStartE2EDuration="5.101760635s" podCreationTimestamp="2026-03-19 19:02:04 +0000 UTC" firstStartedPulling="2026-03-19 19:02:06.030898497 +0000 UTC m=+350.784966810" lastFinishedPulling="2026-03-19 19:02:08.939151209 +0000 UTC m=+353.693219552" observedRunningTime="2026-03-19 19:02:09.098723446 +0000 UTC m=+353.852791769" watchObservedRunningTime="2026-03-19 19:02:09.101760635 +0000 UTC m=+353.855828948" Mar 19 19:02:09 crc kubenswrapper[4826]: I0319 19:02:09.123457 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xnzmj" podStartSLOduration=3.210398702 podStartE2EDuration="6.123436349s" podCreationTimestamp="2026-03-19 19:02:03 +0000 UTC" firstStartedPulling="2026-03-19 19:02:05.016945892 +0000 UTC m=+349.771014205" lastFinishedPulling="2026-03-19 19:02:07.929983519 +0000 UTC m=+352.684051852" observedRunningTime="2026-03-19 19:02:09.11961094 +0000 UTC m=+353.873679253" watchObservedRunningTime="2026-03-19 19:02:09.123436349 +0000 UTC m=+353.877504662" Mar 19 19:02:11 crc kubenswrapper[4826]: I0319 19:02:11.488637 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:11 crc kubenswrapper[4826]: I0319 19:02:11.489037 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:11 crc kubenswrapper[4826]: I0319 
19:02:11.538093 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:12 crc kubenswrapper[4826]: I0319 19:02:12.118416 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:12 crc kubenswrapper[4826]: I0319 19:02:12.119901 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:12 crc kubenswrapper[4826]: I0319 19:02:12.153079 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:02:12 crc kubenswrapper[4826]: I0319 19:02:12.181170 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:13 crc kubenswrapper[4826]: I0319 19:02:13.170650 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plvms" Mar 19 19:02:13 crc kubenswrapper[4826]: I0319 19:02:13.927807 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:13 crc kubenswrapper[4826]: I0319 19:02:13.928082 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:14 crc kubenswrapper[4826]: I0319 19:02:14.807041 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:14 crc kubenswrapper[4826]: I0319 19:02:14.807247 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:14 crc kubenswrapper[4826]: I0319 19:02:14.879178 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:14 crc kubenswrapper[4826]: I0319 19:02:14.994236 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xnzmj" podUID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerName="registry-server" probeResult="failure" output=< Mar 19 19:02:14 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:02:14 crc kubenswrapper[4826]: > Mar 19 19:02:15 crc kubenswrapper[4826]: I0319 19:02:15.173264 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bp86p" Mar 19 19:02:23 crc kubenswrapper[4826]: I0319 19:02:23.997178 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:24 crc kubenswrapper[4826]: I0319 19:02:24.077397 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xnzmj" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.879523 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh"] Mar 19 19:02:52 crc kubenswrapper[4826]: E0319 19:02:52.880567 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e93cc0d-3987-4256-b7cb-6796e66fd509" containerName="oc" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.880590 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e93cc0d-3987-4256-b7cb-6796e66fd509" containerName="oc" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.881024 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e93cc0d-3987-4256-b7cb-6796e66fd509" containerName="oc" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.881897 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.885420 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.885490 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.885805 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.891585 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.891931 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.900680 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh"] Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.994128 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvkk\" (UniqueName: \"kubernetes.io/projected/d8440885-e4b6-49c7-8359-9b50a6ff5b29-kube-api-access-vlvkk\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.994214 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d8440885-e4b6-49c7-8359-9b50a6ff5b29-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" 
(UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:52 crc kubenswrapper[4826]: I0319 19:02:52.994238 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8440885-e4b6-49c7-8359-9b50a6ff5b29-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.095967 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d8440885-e4b6-49c7-8359-9b50a6ff5b29-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.096018 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8440885-e4b6-49c7-8359-9b50a6ff5b29-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.096092 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvkk\" (UniqueName: \"kubernetes.io/projected/d8440885-e4b6-49c7-8359-9b50a6ff5b29-kube-api-access-vlvkk\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 
19:02:53.096945 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d8440885-e4b6-49c7-8359-9b50a6ff5b29-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.105350 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8440885-e4b6-49c7-8359-9b50a6ff5b29-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.124172 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvkk\" (UniqueName: \"kubernetes.io/projected/d8440885-e4b6-49c7-8359-9b50a6ff5b29-kube-api-access-vlvkk\") pod \"cluster-monitoring-operator-6d5b84845-mzsjh\" (UID: \"d8440885-e4b6-49c7-8359-9b50a6ff5b29\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.240958 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" Mar 19 19:02:53 crc kubenswrapper[4826]: I0319 19:02:53.493308 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh"] Mar 19 19:02:54 crc kubenswrapper[4826]: I0319 19:02:54.350112 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" event={"ID":"d8440885-e4b6-49c7-8359-9b50a6ff5b29","Type":"ContainerStarted","Data":"1a93adfe4da2b079286b853f28ad3dd4a4ca459772f7bec20aa1540d216fa501"} Mar 19 19:02:55 crc kubenswrapper[4826]: I0319 19:02:55.401197 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:02:55 crc kubenswrapper[4826]: I0319 19:02:55.401560 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.361801 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" event={"ID":"d8440885-e4b6-49c7-8359-9b50a6ff5b29","Type":"ContainerStarted","Data":"700e390964f5beeaabc0034336fa9dbe48416c36c95145611bcb0c8f014158a2"} Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.370988 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9"] Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.371546 4826 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.372741 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-62bnh" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.373075 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.385389 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9"] Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.389041 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-mzsjh" podStartSLOduration=2.224412838 podStartE2EDuration="4.389025773s" podCreationTimestamp="2026-03-19 19:02:52 +0000 UTC" firstStartedPulling="2026-03-19 19:02:53.498142465 +0000 UTC m=+398.252210778" lastFinishedPulling="2026-03-19 19:02:55.66275539 +0000 UTC m=+400.416823713" observedRunningTime="2026-03-19 19:02:56.385678575 +0000 UTC m=+401.139746898" watchObservedRunningTime="2026-03-19 19:02:56.389025773 +0000 UTC m=+401.143094086" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.444799 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/67f96c65-0583-4f62-a063-98c7e6bbfb87-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-sbhr9\" (UID: \"67f96c65-0583-4f62-a063-98c7e6bbfb87\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.546726 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/67f96c65-0583-4f62-a063-98c7e6bbfb87-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-sbhr9\" (UID: \"67f96c65-0583-4f62-a063-98c7e6bbfb87\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.558007 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/67f96c65-0583-4f62-a063-98c7e6bbfb87-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-sbhr9\" (UID: \"67f96c65-0583-4f62-a063-98c7e6bbfb87\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.686539 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:56 crc kubenswrapper[4826]: I0319 19:02:56.956405 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9"] Mar 19 19:02:56 crc kubenswrapper[4826]: W0319 19:02:56.964823 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f96c65_0583_4f62_a063_98c7e6bbfb87.slice/crio-34cdebe037184eab990d46ad3ccc458814070150ae2b08c47a8d4fca2ae954ff WatchSource:0}: Error finding container 34cdebe037184eab990d46ad3ccc458814070150ae2b08c47a8d4fca2ae954ff: Status 404 returned error can't find the container with id 34cdebe037184eab990d46ad3ccc458814070150ae2b08c47a8d4fca2ae954ff Mar 19 19:02:57 crc kubenswrapper[4826]: I0319 19:02:57.369783 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" 
event={"ID":"67f96c65-0583-4f62-a063-98c7e6bbfb87","Type":"ContainerStarted","Data":"34cdebe037184eab990d46ad3ccc458814070150ae2b08c47a8d4fca2ae954ff"} Mar 19 19:02:59 crc kubenswrapper[4826]: I0319 19:02:59.397242 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" event={"ID":"67f96c65-0583-4f62-a063-98c7e6bbfb87","Type":"ContainerStarted","Data":"d633e3e09132bed99d00b1fbe22f27863f03fa82f637c17e9bce1e725b46e4df"} Mar 19 19:02:59 crc kubenswrapper[4826]: I0319 19:02:59.397592 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:59 crc kubenswrapper[4826]: I0319 19:02:59.406981 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 19:02:59 crc kubenswrapper[4826]: I0319 19:02:59.415441 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podStartSLOduration=1.789477353 podStartE2EDuration="3.415425557s" podCreationTimestamp="2026-03-19 19:02:56 +0000 UTC" firstStartedPulling="2026-03-19 19:02:56.966980351 +0000 UTC m=+401.721048664" lastFinishedPulling="2026-03-19 19:02:58.592928555 +0000 UTC m=+403.346996868" observedRunningTime="2026-03-19 19:02:59.412765018 +0000 UTC m=+404.166833411" watchObservedRunningTime="2026-03-19 19:02:59.415425557 +0000 UTC m=+404.169493870" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.455235 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-7qkts"] Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.456505 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.458874 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.458888 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.459123 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8jjpz" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.459425 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.472533 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-7qkts"] Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.499231 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.499285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcsl\" (UniqueName: \"kubernetes.io/projected/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-kube-api-access-cvcsl\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.499308 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.499328 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-metrics-client-ca\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.600295 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcsl\" (UniqueName: \"kubernetes.io/projected/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-kube-api-access-cvcsl\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.600368 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.600409 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-metrics-client-ca\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.600543 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.604034 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-metrics-client-ca\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.609092 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.609201 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc 
kubenswrapper[4826]: I0319 19:03:00.620477 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcsl\" (UniqueName: \"kubernetes.io/projected/4d1ab610-5cfc-4f32-8e5e-90e3aa86a278-kube-api-access-cvcsl\") pod \"prometheus-operator-db54df47d-7qkts\" (UID: \"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278\") " pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:00 crc kubenswrapper[4826]: I0319 19:03:00.773223 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" Mar 19 19:03:01 crc kubenswrapper[4826]: I0319 19:03:01.268644 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-7qkts"] Mar 19 19:03:01 crc kubenswrapper[4826]: W0319 19:03:01.281327 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1ab610_5cfc_4f32_8e5e_90e3aa86a278.slice/crio-8682d00a2433804c48bc22ca36b8f1dc532cbcc7d8720403afe32c6dd4629ac7 WatchSource:0}: Error finding container 8682d00a2433804c48bc22ca36b8f1dc532cbcc7d8720403afe32c6dd4629ac7: Status 404 returned error can't find the container with id 8682d00a2433804c48bc22ca36b8f1dc532cbcc7d8720403afe32c6dd4629ac7 Mar 19 19:03:01 crc kubenswrapper[4826]: I0319 19:03:01.411608 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" event={"ID":"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278","Type":"ContainerStarted","Data":"8682d00a2433804c48bc22ca36b8f1dc532cbcc7d8720403afe32c6dd4629ac7"} Mar 19 19:03:03 crc kubenswrapper[4826]: I0319 19:03:03.422613 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" event={"ID":"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278","Type":"ContainerStarted","Data":"5ff794de243f99b2be214a14b491587daae7c746bdf992f3295d25e5039575f3"} Mar 19 
19:03:03 crc kubenswrapper[4826]: I0319 19:03:03.423138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" event={"ID":"4d1ab610-5cfc-4f32-8e5e-90e3aa86a278","Type":"ContainerStarted","Data":"4a4f142380867ced47a00aec3c5057681d5733cff653682289799c0eea8128cb"} Mar 19 19:03:03 crc kubenswrapper[4826]: I0319 19:03:03.447734 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-7qkts" podStartSLOduration=1.928403799 podStartE2EDuration="3.447715197s" podCreationTimestamp="2026-03-19 19:03:00 +0000 UTC" firstStartedPulling="2026-03-19 19:03:01.28412027 +0000 UTC m=+406.038188583" lastFinishedPulling="2026-03-19 19:03:02.803431658 +0000 UTC m=+407.557499981" observedRunningTime="2026-03-19 19:03:03.44708085 +0000 UTC m=+408.201149203" watchObservedRunningTime="2026-03-19 19:03:03.447715197 +0000 UTC m=+408.201783520" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.920902 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4"] Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.921854 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.923713 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-gk45l" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.923725 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.924003 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.933198 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.933407 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.933528 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cddb4c8-eded-4ea1-86e9-2d99378281cf-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: 
\"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.933627 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbb4j\" (UniqueName: \"kubernetes.io/projected/6cddb4c8-eded-4ea1-86e9-2d99378281cf-kube-api-access-xbb4j\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.940385 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4"] Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.945328 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch"] Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.946448 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.947989 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.948350 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.948596 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.949727 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9k8qq" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.955303 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-czcv4"] Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.956548 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.959609 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.959695 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-g9gdb" Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.960485 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch"] Mar 19 19:03:05 crc kubenswrapper[4826]: I0319 19:03:05.964486 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.034900 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-api-access-z7696\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.034963 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035011 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035028 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035071 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035090 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035114 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cddb4c8-eded-4ea1-86e9-2d99378281cf-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: 
\"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035131 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035149 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbb4j\" (UniqueName: \"kubernetes.io/projected/6cddb4c8-eded-4ea1-86e9-2d99378281cf-kube-api-access-xbb4j\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.035166 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.036755 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cddb4c8-eded-4ea1-86e9-2d99378281cf-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.043309 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.058212 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6cddb4c8-eded-4ea1-86e9-2d99378281cf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.059759 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbb4j\" (UniqueName: \"kubernetes.io/projected/6cddb4c8-eded-4ea1-86e9-2d99378281cf-kube-api-access-xbb4j\") pod \"openshift-state-metrics-566fddb674-nqvg4\" (UID: \"6cddb4c8-eded-4ea1-86e9-2d99378281cf\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136076 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-tls\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136132 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: 
\"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136164 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-root\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136180 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbck5\" (UniqueName: \"kubernetes.io/projected/b9517cee-fd39-4d14-9084-aed048265f25-kube-api-access-jbck5\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136206 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136228 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-textfile\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136407 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-wtmp\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136455 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136528 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136570 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136601 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136640 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9517cee-fd39-4d14-9084-aed048265f25-metrics-client-ca\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136727 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-api-access-z7696\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: E0319 19:03:06.136754 4826 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136767 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-sys\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: E0319 19:03:06.136829 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls podName:d5ee1a7c-0363-484c-a4d3-34a094cbf64c nodeName:}" failed. No retries permitted until 2026-03-19 19:03:06.636807948 +0000 UTC m=+411.390876261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-l7kch" (UID: "d5ee1a7c-0363-484c-a4d3-34a094cbf64c") : secret "kube-state-metrics-tls" not found Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.136990 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.137375 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.137417 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.140098 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: 
\"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.150786 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-api-access-z7696\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.235267 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237431 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9517cee-fd39-4d14-9084-aed048265f25-metrics-client-ca\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237479 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-sys\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237513 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-tls\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237548 4826 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-root\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237570 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbck5\" (UniqueName: \"kubernetes.io/projected/b9517cee-fd39-4d14-9084-aed048265f25-kube-api-access-jbck5\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237607 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-textfile\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237617 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-sys\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237629 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-wtmp\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237741 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237754 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-wtmp\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.237936 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9517cee-fd39-4d14-9084-aed048265f25-root\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.238163 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-textfile\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.238273 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9517cee-fd39-4d14-9084-aed048265f25-metrics-client-ca\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.240782 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.241108 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9517cee-fd39-4d14-9084-aed048265f25-node-exporter-tls\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.258235 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbck5\" (UniqueName: \"kubernetes.io/projected/b9517cee-fd39-4d14-9084-aed048265f25-kube-api-access-jbck5\") pod \"node-exporter-czcv4\" (UID: \"b9517cee-fd39-4d14-9084-aed048265f25\") " pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.268417 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-czcv4" Mar 19 19:03:06 crc kubenswrapper[4826]: W0319 19:03:06.296644 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9517cee_fd39_4d14_9084_aed048265f25.slice/crio-68a61a629b624959a96a0511ec20af0b44c87e745c65c6874e4b77145f86573a WatchSource:0}: Error finding container 68a61a629b624959a96a0511ec20af0b44c87e745c65c6874e4b77145f86573a: Status 404 returned error can't find the container with id 68a61a629b624959a96a0511ec20af0b44c87e745c65c6874e4b77145f86573a Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.450643 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-czcv4" event={"ID":"b9517cee-fd39-4d14-9084-aed048265f25","Type":"ContainerStarted","Data":"68a61a629b624959a96a0511ec20af0b44c87e745c65c6874e4b77145f86573a"} Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.644680 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.651622 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5ee1a7c-0363-484c-a4d3-34a094cbf64c-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l7kch\" (UID: \"d5ee1a7c-0363-484c-a4d3-34a094cbf64c\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.656320 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4"] Mar 19 19:03:06 crc 
kubenswrapper[4826]: I0319 19:03:06.857519 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.944124 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.947805 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.951355 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.951598 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.952053 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.952254 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.952387 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.952505 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.957953 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.959001 4826 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"alertmanager-main-dockercfg-hx76n" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.964429 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 19:03:06 crc kubenswrapper[4826]: I0319 19:03:06.980329 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.050732 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.050859 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.050892 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4hqf\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-kube-api-access-w4hqf\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.050932 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051040 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051207 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051281 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051347 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051412 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-web-config\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051446 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051484 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.051533 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-out\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.118778 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch"] Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152586 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152634 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152678 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-web-config\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152695 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152716 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152739 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-out\") pod \"alertmanager-main-0\" (UID: 
\"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152775 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152811 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152855 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4hqf\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-kube-api-access-w4hqf\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152882 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.152929 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.153071 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.153733 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.153783 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.153997 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.158336 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.158849 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.158990 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-volume\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.159111 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-web-config\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.159152 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.163099 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.163443 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-config-out\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.167286 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.170034 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4hqf\" (UniqueName: \"kubernetes.io/projected/6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c-kube-api-access-w4hqf\") pod \"alertmanager-main-0\" (UID: \"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.272157 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.457863 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" event={"ID":"d5ee1a7c-0363-484c-a4d3-34a094cbf64c","Type":"ContainerStarted","Data":"523f2a847471a9576e1f13575344cf0577147e01970756bdddd5b5b1bbc4fdf0"} Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.459584 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" event={"ID":"6cddb4c8-eded-4ea1-86e9-2d99378281cf","Type":"ContainerStarted","Data":"5a732564bcdfabafc155077e80703e0fe4f344548ba8ec48d761f8ae4ac90351"} Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.459614 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" event={"ID":"6cddb4c8-eded-4ea1-86e9-2d99378281cf","Type":"ContainerStarted","Data":"f62506f2add87be6bae4776b312777920ae0d21a750994521073b490acc154aa"} Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.459627 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" event={"ID":"6cddb4c8-eded-4ea1-86e9-2d99378281cf","Type":"ContainerStarted","Data":"64f351d3efe4bec0fd4e3b4e061365074d91f7c9effe176630913f126db68e30"} Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.911823 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-788cb6bfb6-558hf"] Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.914073 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.915568 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-j26xj" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.915969 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.916216 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.916379 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.916417 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.919364 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.919596 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-j3lekm4apdg3" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.930855 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-788cb6bfb6-558hf"] Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.948438 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.965960 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966019 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-metrics-client-ca\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966097 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966164 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-grpc-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " 
pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966204 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966231 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5pk\" (UniqueName: \"kubernetes.io/projected/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-kube-api-access-pg5pk\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: I0319 19:03:07.966264 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:07 crc kubenswrapper[4826]: W0319 19:03:07.967746 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df4c2f2_13c6_4a5c_b455_fc8d6dd5c64c.slice/crio-4b701164ec4ec105291e930945500645056032c2ed089a115531ccad88e9f6b0 WatchSource:0}: Error finding container 4b701164ec4ec105291e930945500645056032c2ed089a115531ccad88e9f6b0: Status 404 returned error can't find the container with id 4b701164ec4ec105291e930945500645056032c2ed089a115531ccad88e9f6b0 Mar 19 
19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066745 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066792 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5pk\" (UniqueName: \"kubernetes.io/projected/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-kube-api-access-pg5pk\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066826 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066878 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066899 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-metrics-client-ca\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066944 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.066971 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-grpc-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.067964 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-metrics-client-ca\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.071798 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-grpc-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.071814 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.072094 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-tls\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.073308 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.076076 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: 
\"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.076981 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.085488 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5pk\" (UniqueName: \"kubernetes.io/projected/c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8-kube-api-access-pg5pk\") pod \"thanos-querier-788cb6bfb6-558hf\" (UID: \"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8\") " pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.235048 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.468768 4826 generic.go:334] "Generic (PLEG): container finished" podID="b9517cee-fd39-4d14-9084-aed048265f25" containerID="edbf123419aaa774dc07714954bd7b50754216119b2032372d191d1a8d17dba9" exitCode=0 Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.469010 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-czcv4" event={"ID":"b9517cee-fd39-4d14-9084-aed048265f25","Type":"ContainerDied","Data":"edbf123419aaa774dc07714954bd7b50754216119b2032372d191d1a8d17dba9"} Mar 19 19:03:08 crc kubenswrapper[4826]: I0319 19:03:08.471417 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"4b701164ec4ec105291e930945500645056032c2ed089a115531ccad88e9f6b0"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.029609 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-788cb6bfb6-558hf"] Mar 19 19:03:09 crc kubenswrapper[4826]: W0319 19:03:09.048295 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f0645f_c1e5_40ed_8cc7_d3b7e15175b8.slice/crio-9d5b6f822b51f8ff1287629ef3af375881c3b1ad720a0cd01352bdc272e980ac WatchSource:0}: Error finding container 9d5b6f822b51f8ff1287629ef3af375881c3b1ad720a0cd01352bdc272e980ac: Status 404 returned error can't find the container with id 9d5b6f822b51f8ff1287629ef3af375881c3b1ad720a0cd01352bdc272e980ac Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.481855 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" 
event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"9d5b6f822b51f8ff1287629ef3af375881c3b1ad720a0cd01352bdc272e980ac"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.488988 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" event={"ID":"d5ee1a7c-0363-484c-a4d3-34a094cbf64c","Type":"ContainerStarted","Data":"57baf5a583718014f3f3af9bfed15d303a340b9ffaf5c0398e06a634a80aae9e"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.489057 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" event={"ID":"d5ee1a7c-0363-484c-a4d3-34a094cbf64c","Type":"ContainerStarted","Data":"5da47ec762a7d395d094bfe78e58c4626dab0ff9aaa41bc5b68afdac42d06abf"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.489072 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" event={"ID":"d5ee1a7c-0363-484c-a4d3-34a094cbf64c","Type":"ContainerStarted","Data":"90b7276f2a11203ed4018a116df9e200bcf20eb9699bcfae07a8fdba70391d5e"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.492606 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" event={"ID":"6cddb4c8-eded-4ea1-86e9-2d99378281cf","Type":"ContainerStarted","Data":"92f49c205c3b86ffad6c63d15d15c1b7d79db7a0d790316040dd85e7d6e78450"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.496314 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-czcv4" event={"ID":"b9517cee-fd39-4d14-9084-aed048265f25","Type":"ContainerStarted","Data":"74ecaf9372736ddb9292bd7d87182c8a491371e178c3fb67c5f7fcbdac4af23e"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.496342 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-czcv4" 
event={"ID":"b9517cee-fd39-4d14-9084-aed048265f25","Type":"ContainerStarted","Data":"52d7b09dcc8944a14b3057824d4232e1c85d0281532b5076910cfac996c87a4f"} Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.517695 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l7kch" podStartSLOduration=2.87440955 podStartE2EDuration="4.517675231s" podCreationTimestamp="2026-03-19 19:03:05 +0000 UTC" firstStartedPulling="2026-03-19 19:03:07.134992259 +0000 UTC m=+411.889060582" lastFinishedPulling="2026-03-19 19:03:08.77825795 +0000 UTC m=+413.532326263" observedRunningTime="2026-03-19 19:03:09.503942028 +0000 UTC m=+414.258010341" watchObservedRunningTime="2026-03-19 19:03:09.517675231 +0000 UTC m=+414.271743544" Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.543788 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-czcv4" podStartSLOduration=3.273917876 podStartE2EDuration="4.543765639s" podCreationTimestamp="2026-03-19 19:03:05 +0000 UTC" firstStartedPulling="2026-03-19 19:03:06.29826917 +0000 UTC m=+411.052337483" lastFinishedPulling="2026-03-19 19:03:07.568116933 +0000 UTC m=+412.322185246" observedRunningTime="2026-03-19 19:03:09.539447055 +0000 UTC m=+414.293515368" watchObservedRunningTime="2026-03-19 19:03:09.543765639 +0000 UTC m=+414.297833962" Mar 19 19:03:09 crc kubenswrapper[4826]: I0319 19:03:09.564002 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-nqvg4" podStartSLOduration=2.786807476 podStartE2EDuration="4.563973163s" podCreationTimestamp="2026-03-19 19:03:05 +0000 UTC" firstStartedPulling="2026-03-19 19:03:07.005368717 +0000 UTC m=+411.759437030" lastFinishedPulling="2026-03-19 19:03:08.782534404 +0000 UTC m=+413.536602717" observedRunningTime="2026-03-19 19:03:09.554707398 +0000 UTC m=+414.308775751" 
watchObservedRunningTime="2026-03-19 19:03:09.563973163 +0000 UTC m=+414.318041516" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.505063 4826 generic.go:334] "Generic (PLEG): container finished" podID="6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c" containerID="6626879669ae2488a998193f535bbdc2e52556b3a5882229771d6430a78cf80c" exitCode=0 Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.506609 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerDied","Data":"6626879669ae2488a998193f535bbdc2e52556b3a5882229771d6430a78cf80c"} Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.683028 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.684317 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.711871 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812717 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812768 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" 
Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812791 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812822 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812847 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8jv\" (UniqueName: \"kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.812924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca\") pod \"console-7c6477bfff-gdnfs\" (UID: 
\"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.913912 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.913971 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8jv\" (UniqueName: \"kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.914006 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.914075 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.914121 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: 
\"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.914488 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.915162 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.915396 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.915697 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.916343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " 
pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.917755 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.919305 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.922296 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:10 crc kubenswrapper[4826]: I0319 19:03:10.943530 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8jv\" (UniqueName: \"kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv\") pod \"console-7c6477bfff-gdnfs\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.000746 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.335638 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-657c5b447-gjh5h"] Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.336511 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.339000 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-466hb" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.339225 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.339383 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.339460 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-eo2ljhv1bf4up" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.339960 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.341311 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.349997 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-657c5b447-gjh5h"] Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.421633 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-client-certs\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422041 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422103 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-metrics-server-audit-profiles\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422268 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m78c\" (UniqueName: \"kubernetes.io/projected/ab7c3046-ac34-417e-a7c6-63e500286063-kube-api-access-5m78c\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422534 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab7c3046-ac34-417e-a7c6-63e500286063-audit-log\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " 
pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422582 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-server-tls\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.422736 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-client-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523818 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m78c\" (UniqueName: \"kubernetes.io/projected/ab7c3046-ac34-417e-a7c6-63e500286063-kube-api-access-5m78c\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523890 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-server-tls\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523909 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/ab7c3046-ac34-417e-a7c6-63e500286063-audit-log\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523937 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-client-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-client-certs\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.523999 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.524042 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-metrics-server-audit-profiles\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.526406 
4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.526682 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab7c3046-ac34-417e-a7c6-63e500286063-audit-log\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.526954 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab7c3046-ac34-417e-a7c6-63e500286063-metrics-server-audit-profiles\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.528817 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-client-certs\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.529443 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-secret-metrics-server-tls\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " 
pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.529717 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7c3046-ac34-417e-a7c6-63e500286063-client-ca-bundle\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.541709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m78c\" (UniqueName: \"kubernetes.io/projected/ab7c3046-ac34-417e-a7c6-63e500286063-kube-api-access-5m78c\") pod \"metrics-server-657c5b447-gjh5h\" (UID: \"ab7c3046-ac34-417e-a7c6-63e500286063\") " pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.660062 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.663312 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4"] Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.664720 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.667531 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4"] Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.668029 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.669733 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.828695 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2b19eec2-98e8-47bd-b68f-55b033eb788c-monitoring-plugin-cert\") pod \"monitoring-plugin-747c5d4c44-ltxl4\" (UID: \"2b19eec2-98e8-47bd-b68f-55b033eb788c\") " pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.932506 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2b19eec2-98e8-47bd-b68f-55b033eb788c-monitoring-plugin-cert\") pod \"monitoring-plugin-747c5d4c44-ltxl4\" (UID: \"2b19eec2-98e8-47bd-b68f-55b033eb788c\") " pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.938009 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2b19eec2-98e8-47bd-b68f-55b033eb788c-monitoring-plugin-cert\") pod \"monitoring-plugin-747c5d4c44-ltxl4\" (UID: \"2b19eec2-98e8-47bd-b68f-55b033eb788c\") " pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.941166 4826 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-657c5b447-gjh5h"] Mar 19 19:03:11 crc kubenswrapper[4826]: W0319 19:03:11.953732 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab7c3046_ac34_417e_a7c6_63e500286063.slice/crio-c828155f9b9dc7de671cecff8a5fc9e9f3b9d67da86f569a6331425bf101afc4 WatchSource:0}: Error finding container c828155f9b9dc7de671cecff8a5fc9e9f3b9d67da86f569a6331425bf101afc4: Status 404 returned error can't find the container with id c828155f9b9dc7de671cecff8a5fc9e9f3b9d67da86f569a6331425bf101afc4 Mar 19 19:03:11 crc kubenswrapper[4826]: I0319 19:03:11.987686 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.199471 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.204453 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4"] Mar 19 19:03:12 crc kubenswrapper[4826]: W0319 19:03:12.229869 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b19eec2_98e8_47bd_b68f_55b033eb788c.slice/crio-ac03a1cd8645b7e29fe51fa000dbbd95edcad91f234eed209038c59b7a042995 WatchSource:0}: Error finding container ac03a1cd8645b7e29fe51fa000dbbd95edcad91f234eed209038c59b7a042995: Status 404 returned error can't find the container with id ac03a1cd8645b7e29fe51fa000dbbd95edcad91f234eed209038c59b7a042995 Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.247522 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.248944 4826 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.250336 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.255528 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.255769 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.255967 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256112 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256268 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256366 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256491 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256591 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.256717 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-bpsr8"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.257143 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-45sj43udjluik"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.257260 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.261607 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.266281 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.270218 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.439857 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.439899 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.439929 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.439971 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.439999 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440021 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440043 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440060 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440079 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440096 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440113 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftxwc\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-kube-api-access-ftxwc\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440131 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440148 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440177 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440193 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440218 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440245 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.440261 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.522509 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" event={"ID":"ab7c3046-ac34-417e-a7c6-63e500286063","Type":"ContainerStarted","Data":"c828155f9b9dc7de671cecff8a5fc9e9f3b9d67da86f569a6331425bf101afc4"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.524823 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6477bfff-gdnfs" event={"ID":"cc698a42-bbab-4c83-b928-5b7dc3c75b33","Type":"ContainerStarted","Data":"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.524851 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6477bfff-gdnfs" event={"ID":"cc698a42-bbab-4c83-b928-5b7dc3c75b33","Type":"ContainerStarted","Data":"c139b949197fd8038a8d3ce88ac1a0b8419ce316c3f9cac71fb522710cc47d2c"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.533729 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"e9d975ec1ac22c06a34fff651dbe991741cc2ea1860de85b65baa1ac6acafec4"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.533793 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"3af91a892c8d94d3c1cc1f6f93984346fa62df3e2798362947dc4edec71f2d6b"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.533805 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"4bf14d08f00f5be7969edcc5230afd9d02008b7714eafc7bfb32c1f5042c71dd"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.535461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" event={"ID":"2b19eec2-98e8-47bd-b68f-55b033eb788c","Type":"ContainerStarted","Data":"ac03a1cd8645b7e29fe51fa000dbbd95edcad91f234eed209038c59b7a042995"}
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542593 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542709 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542795 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542815 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542862 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542951 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.542967 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftxwc\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-kube-api-access-ftxwc\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543120 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543232 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543280 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543321 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543349 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543410 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543427 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543447 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543489 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.543524 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.545314 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.550545 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.551856 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.554929 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.556740 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.557439 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.558162 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.558307 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.558635 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.558630 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.559343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.559471 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.562974 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb23233f-a975-4476-8bff-5e7b4b9c8646-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.563029 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.563812 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-config\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.564007 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb23233f-a975-4476-8bff-5e7b4b9c8646-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.568134 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb23233f-a975-4476-8bff-5e7b4b9c8646-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.574545 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftxwc\" (UniqueName: \"kubernetes.io/projected/cb23233f-a975-4476-8bff-5e7b4b9c8646-kube-api-access-ftxwc\") pod \"prometheus-k8s-0\" (UID: \"cb23233f-a975-4476-8bff-5e7b4b9c8646\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:12 crc kubenswrapper[4826]: I0319 19:03:12.588090 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:13 crc kubenswrapper[4826]: I0319 19:03:13.001110 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c6477bfff-gdnfs" podStartSLOduration=3.001094209 podStartE2EDuration="3.001094209s" podCreationTimestamp="2026-03-19 19:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:03:12.548583614 +0000 UTC m=+417.302651937" watchObservedRunningTime="2026-03-19 19:03:13.001094209 +0000 UTC m=+417.755162522"
Mar 19 19:03:13 crc kubenswrapper[4826]: I0319 19:03:13.002343 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 19 19:03:13 crc kubenswrapper[4826]: I0319 19:03:13.544828 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"74f55353d2508de5124f19f77cad0c40376ddc6a25c4810b6d81aae8f0f687c1"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.554534 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"4c775ab787da639d24e259a3d682cda6e9202c103495b974e963c607938760fc"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.554933 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"92f9badfa842abd9cae2bdd1be962b8853e2cf26b746d1a751c07b2f22193d49"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.554954 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf"
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.554965 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" event={"ID":"c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8","Type":"ContainerStarted","Data":"7c8364808eb1b3ebf1fdb4b61e30f090464bcf1aade920b6c54cb031002956e3"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.558981 4826 generic.go:334] "Generic (PLEG): container finished" podID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerID="f8fa0a1ff157316405069e66f026a1d5bb580879c878a1adf53983a291ec1602" exitCode=0
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.559054 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerDied","Data":"f8fa0a1ff157316405069e66f026a1d5bb580879c878a1adf53983a291ec1602"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.561359 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" event={"ID":"ab7c3046-ac34-417e-a7c6-63e500286063","Type":"ContainerStarted","Data":"1d31d0d8db0c2bb8340ffd0cc50bd990421104e190bb05e42ac92b09f1760326"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.567923 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"48acf9caed17fa8badd7752f35a81d149111da3aa4d9acd9b5f028663e51bd11"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.567968 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"278ef6c7a843a003978b52def8b9b738cecc42937be403a4fe3e19f656116008"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.567984 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"ecc28c24d55c431a387de88206645aee83113141cdf40f6882ad70e49d4be17f"}
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.579364 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podStartSLOduration=2.669137901 podStartE2EDuration="7.579344675s" podCreationTimestamp="2026-03-19 19:03:07 +0000 UTC" firstStartedPulling="2026-03-19 19:03:09.052777248 +0000 UTC m=+413.806845561" lastFinishedPulling="2026-03-19 19:03:13.962984012 +0000 UTC m=+418.717052335" observedRunningTime="2026-03-19 19:03:14.575602436 +0000 UTC m=+419.329670779" watchObservedRunningTime="2026-03-19 19:03:14.579344675 +0000 UTC m=+419.333412988"
Mar 19 19:03:14 crc kubenswrapper[4826]: I0319 19:03:14.593886 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podStartSLOduration=1.586122675 podStartE2EDuration="3.593869768s" podCreationTimestamp="2026-03-19 19:03:11 +0000 UTC" firstStartedPulling="2026-03-19 19:03:11.958420724 +0000 UTC m=+416.712489037" lastFinishedPulling="2026-03-19 19:03:13.966167797 +0000 UTC m=+418.720236130" observedRunningTime="2026-03-19 19:03:14.591990338 +0000 UTC m=+419.346058671" watchObservedRunningTime="2026-03-19 19:03:14.593869768 +0000 UTC m=+419.347938071"
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.575247 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" event={"ID":"2b19eec2-98e8-47bd-b68f-55b033eb788c","Type":"ContainerStarted","Data":"7d2e743e7d1efbf6cff8845dffdfa1fde9fe3f3668c99e313dd18bf0c86db911"}
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.575636 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4"
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.581626 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"155d909cb07bb0260121d45194e33da3b660b9c3879b205427adaad6e90c1994"}
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.581690 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"7a85953028fa04b59992598583b344a61a23c3503838fa1e979e3c79b99233d7"}
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.581702 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6df4c2f2-13c6-4a5c-b455-fc8d6dd5c64c","Type":"ContainerStarted","Data":"166a0675b8bbbc8cec657a9b4f245830cb7eb562fded7c02bcd93e9fce56da78"}
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.583410 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4"
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.594631 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podStartSLOduration=1.9190213630000001 podStartE2EDuration="4.594617877s" podCreationTimestamp="2026-03-19 19:03:11 +0000 UTC" firstStartedPulling="2026-03-19 19:03:12.248623625 +0000 UTC m=+417.002691938" lastFinishedPulling="2026-03-19 19:03:14.924220139 +0000 UTC m=+419.678288452" observedRunningTime="2026-03-19 19:03:15.592181302 +0000 UTC m=+420.346249645" watchObservedRunningTime="2026-03-19 19:03:15.594617877 +0000 UTC m=+420.348686180"
Mar 19 19:03:15 crc kubenswrapper[4826]: I0319 19:03:15.620754 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.6211429109999997 podStartE2EDuration="9.620729296s" podCreationTimestamp="2026-03-19 19:03:06 +0000 UTC" firstStartedPulling="2026-03-19 19:03:07.970180247 +0000 UTC m=+412.724248560" lastFinishedPulling="2026-03-19 19:03:13.969766592 +0000 UTC m=+418.723834945" observedRunningTime="2026-03-19 19:03:15.619902214 +0000 UTC m=+420.373970547" watchObservedRunningTime="2026-03-19 19:03:15.620729296 +0000 UTC m=+420.374797629"
Mar 19 19:03:18 crc kubenswrapper[4826]: I0319 19:03:18.255985 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf"
Mar 19 19:03:18 crc kubenswrapper[4826]: I0319 19:03:18.604508 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"b7376c82899de5f2fb71856599e9de5f734c9533bd70961c5145e14cee2304c4"}
Mar 19 19:03:18 crc kubenswrapper[4826]: I0319 19:03:18.604572 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"351bf76711eccad486efb1e37ee7cf1786d24c811c3f845aefac1dd4256f38f0"}
Mar 19 19:03:18 crc kubenswrapper[4826]: I0319 19:03:18.604591 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f"}
Mar 19 19:03:19 crc kubenswrapper[4826]: I0319 19:03:19.615546 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"4bb0b8f9eb1367b622be6793e5ea0338b1f0c67f4d7d5c715f1fb372f22f83ad"}
Mar 19 19:03:19 crc kubenswrapper[4826]: I0319 19:03:19.615950 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"7900df2a1b7738af5eeb9a908b0de29a1e5f103a14d73c59dc973388fc0d9f42"}
Mar 19 19:03:19 crc kubenswrapper[4826]: I0319 19:03:19.615974 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"c1a9ded40aa52d213e77302ffe1cd99edd27fd1a6dd89ef6e3863595e67b5f7a"}
Mar 19 19:03:19 crc kubenswrapper[4826]: I0319 19:03:19.652064 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.209233073 podStartE2EDuration="7.65202976s" podCreationTimestamp="2026-03-19 19:03:12 +0000 UTC" firstStartedPulling="2026-03-19 19:03:14.560770324 +0000 UTC m=+419.314838647" lastFinishedPulling="2026-03-19 19:03:18.003567011 +0000 UTC m=+422.757635334" observedRunningTime="2026-03-19 19:03:19.65051847 +0000 UTC m=+424.404586813" watchObservedRunningTime="2026-03-19 19:03:19.65202976 +0000 UTC m=+424.406098083"
Mar 19 19:03:21 crc kubenswrapper[4826]: I0319 19:03:21.001211 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c6477bfff-gdnfs"
Mar 19 19:03:21 crc kubenswrapper[4826]: I0319 19:03:21.001280 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c6477bfff-gdnfs"
Mar 19 19:03:21 crc kubenswrapper[4826]: I0319 19:03:21.008335 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c6477bfff-gdnfs"
Mar 19 19:03:21 crc kubenswrapper[4826]: I0319 19:03:21.638859 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c6477bfff-gdnfs"
Mar 19 19:03:21 crc kubenswrapper[4826]: I0319 19:03:21.715137 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p892r"]
Mar 19 19:03:22 crc kubenswrapper[4826]: I0319 19:03:22.588941 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 19:03:25 crc kubenswrapper[4826]: I0319 19:03:25.400766 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 19:03:25 crc kubenswrapper[4826]: I0319 19:03:25.401910 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 19:03:31 crc kubenswrapper[4826]: I0319 19:03:31.660751 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h"
Mar 19 19:03:31 crc kubenswrapper[4826]: I0319 19:03:31.661875 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h"
Mar 19 19:03:46 crc kubenswrapper[4826]: I0319 19:03:46.766020 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-p892r" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" containerName="console" containerID="cri-o://fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42" gracePeriod=15
Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.352074 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p892r_db8ad588-15a8-47f2-97d5-950d4a757183/console/0.log"
Mar 19 19:03:47 crc kubenswrapper[4826]: I0319
19:03:47.352178 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-p892r" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551249 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77zqc\" (UniqueName: \"kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551713 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551756 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551834 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551862 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: 
I0319 19:03:47.551897 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.551933 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert\") pod \"db8ad588-15a8-47f2-97d5-950d4a757183\" (UID: \"db8ad588-15a8-47f2-97d5-950d4a757183\") " Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.552606 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config" (OuterVolumeSpecName: "console-config") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.552755 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.552798 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca" (OuterVolumeSpecName: "service-ca") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.552778 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.557087 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.557396 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc" (OuterVolumeSpecName: "kube-api-access-77zqc") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "kube-api-access-77zqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.557852 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db8ad588-15a8-47f2-97d5-950d4a757183" (UID: "db8ad588-15a8-47f2-97d5-950d4a757183"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653247 4826 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653288 4826 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8ad588-15a8-47f2-97d5-950d4a757183-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653297 4826 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653306 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653314 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653322 4826 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8ad588-15a8-47f2-97d5-950d4a757183-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.653330 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77zqc\" (UniqueName: \"kubernetes.io/projected/db8ad588-15a8-47f2-97d5-950d4a757183-kube-api-access-77zqc\") on node \"crc\" DevicePath \"\"" Mar 19 19:03:47 crc 
kubenswrapper[4826]: I0319 19:03:47.887014 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-p892r_db8ad588-15a8-47f2-97d5-950d4a757183/console/0.log" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.887055 4826 generic.go:334] "Generic (PLEG): container finished" podID="db8ad588-15a8-47f2-97d5-950d4a757183" containerID="fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42" exitCode=2 Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.887081 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p892r" event={"ID":"db8ad588-15a8-47f2-97d5-950d4a757183","Type":"ContainerDied","Data":"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42"} Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.887108 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-p892r" event={"ID":"db8ad588-15a8-47f2-97d5-950d4a757183","Type":"ContainerDied","Data":"4514fae9eb9413de939022a0a83779ffc4e12982e4d63d5bd09182c188e4d252"} Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.887124 4826 scope.go:117] "RemoveContainer" containerID="fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.887185 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-p892r" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.918637 4826 scope.go:117] "RemoveContainer" containerID="fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42" Mar 19 19:03:47 crc kubenswrapper[4826]: E0319 19:03:47.919047 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42\": container with ID starting with fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42 not found: ID does not exist" containerID="fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.919077 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42"} err="failed to get container status \"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42\": rpc error: code = NotFound desc = could not find container \"fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42\": container with ID starting with fd0778cea8bc6275ed6e6423bd7ef2ef76b80d94d0f747e3ea7062dbb60aeb42 not found: ID does not exist" Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.929467 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-p892r"] Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.932998 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-p892r"] Mar 19 19:03:47 crc kubenswrapper[4826]: I0319 19:03:47.983639 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" path="/var/lib/kubelet/pods/db8ad588-15a8-47f2-97d5-950d4a757183/volumes" Mar 19 19:03:51 crc kubenswrapper[4826]: I0319 19:03:51.668022 4826 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:51 crc kubenswrapper[4826]: I0319 19:03:51.674367 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.400337 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.401734 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.401818 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.402849 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.402961 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" 
containerID="cri-o://3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911" gracePeriod=600 Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.962521 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911" exitCode=0 Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.962936 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911"} Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.962965 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a"} Mar 19 19:03:55 crc kubenswrapper[4826]: I0319 19:03:55.962981 4826 scope.go:117] "RemoveContainer" containerID="9d92f655f3b11b40bcc07704e2387d92e12e6f78e1df6ba8885d1c76be823e80" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.152779 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565784-gqmkb"] Mar 19 19:04:00 crc kubenswrapper[4826]: E0319 19:04:00.153701 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" containerName="console" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.153722 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" containerName="console" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.153972 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8ad588-15a8-47f2-97d5-950d4a757183" containerName="console" Mar 19 19:04:00 crc kubenswrapper[4826]: 
I0319 19:04:00.154574 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.158772 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.162650 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.163180 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.167853 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-gqmkb"] Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.251981 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn87h\" (UniqueName: \"kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h\") pod \"auto-csr-approver-29565784-gqmkb\" (UID: \"c12e6736-71d9-48ae-8df3-c385a2dd259f\") " pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.353799 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn87h\" (UniqueName: \"kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h\") pod \"auto-csr-approver-29565784-gqmkb\" (UID: \"c12e6736-71d9-48ae-8df3-c385a2dd259f\") " pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.382191 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn87h\" (UniqueName: \"kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h\") pod 
\"auto-csr-approver-29565784-gqmkb\" (UID: \"c12e6736-71d9-48ae-8df3-c385a2dd259f\") " pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.493912 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:00 crc kubenswrapper[4826]: I0319 19:04:00.760873 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-gqmkb"] Mar 19 19:04:01 crc kubenswrapper[4826]: I0319 19:04:01.006993 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" event={"ID":"c12e6736-71d9-48ae-8df3-c385a2dd259f","Type":"ContainerStarted","Data":"9ba412e030692c65490d3108a491a63611fe2bbffe797bf741598764b7de8cf4"} Mar 19 19:04:03 crc kubenswrapper[4826]: I0319 19:04:03.026221 4826 generic.go:334] "Generic (PLEG): container finished" podID="c12e6736-71d9-48ae-8df3-c385a2dd259f" containerID="d19c308138bf340eca1e68c570b2372a1f1e2e310f8150adea90b049a74ec80d" exitCode=0 Mar 19 19:04:03 crc kubenswrapper[4826]: I0319 19:04:03.026533 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" event={"ID":"c12e6736-71d9-48ae-8df3-c385a2dd259f","Type":"ContainerDied","Data":"d19c308138bf340eca1e68c570b2372a1f1e2e310f8150adea90b049a74ec80d"} Mar 19 19:04:04 crc kubenswrapper[4826]: I0319 19:04:04.965314 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.040104 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" event={"ID":"c12e6736-71d9-48ae-8df3-c385a2dd259f","Type":"ContainerDied","Data":"9ba412e030692c65490d3108a491a63611fe2bbffe797bf741598764b7de8cf4"} Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.040137 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ba412e030692c65490d3108a491a63611fe2bbffe797bf741598764b7de8cf4" Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.040202 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565784-gqmkb" Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.126522 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn87h\" (UniqueName: \"kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h\") pod \"c12e6736-71d9-48ae-8df3-c385a2dd259f\" (UID: \"c12e6736-71d9-48ae-8df3-c385a2dd259f\") " Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.135358 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h" (OuterVolumeSpecName: "kube-api-access-sn87h") pod "c12e6736-71d9-48ae-8df3-c385a2dd259f" (UID: "c12e6736-71d9-48ae-8df3-c385a2dd259f"). InnerVolumeSpecName "kube-api-access-sn87h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:04:05 crc kubenswrapper[4826]: I0319 19:04:05.230175 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn87h\" (UniqueName: \"kubernetes.io/projected/c12e6736-71d9-48ae-8df3-c385a2dd259f-kube-api-access-sn87h\") on node \"crc\" DevicePath \"\"" Mar 19 19:04:06 crc kubenswrapper[4826]: I0319 19:04:06.043924 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-hxldh"] Mar 19 19:04:06 crc kubenswrapper[4826]: I0319 19:04:06.052976 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565778-hxldh"] Mar 19 19:04:07 crc kubenswrapper[4826]: I0319 19:04:07.991014 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509cd3a8-f3bb-4214-a70b-e589905ad242" path="/var/lib/kubelet/pods/509cd3a8-f3bb-4214-a70b-e589905ad242/volumes" Mar 19 19:04:12 crc kubenswrapper[4826]: I0319 19:04:12.589345 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 19:04:12 crc kubenswrapper[4826]: I0319 19:04:12.637608 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 19:04:13 crc kubenswrapper[4826]: I0319 19:04:13.149420 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.853244 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"] Mar 19 19:04:24 crc kubenswrapper[4826]: E0319 19:04:24.858066 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12e6736-71d9-48ae-8df3-c385a2dd259f" containerName="oc" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.858098 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12e6736-71d9-48ae-8df3-c385a2dd259f" 
containerName="oc" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.858257 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12e6736-71d9-48ae-8df3-c385a2dd259f" containerName="oc" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.858840 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.872518 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"] Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964489 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964531 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964578 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964593 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n8jpn\" (UniqueName: \"kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964639 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964660 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:24 crc kubenswrapper[4826]: I0319 19:04:24.964715 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.066799 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.066876 4826 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.066966 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.067067 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.067099 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.067183 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.067214 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n8jpn\" (UniqueName: \"kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.068166 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.068212 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.068459 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.068836 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.074846 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.074876 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.082238 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jpn\" (UniqueName: \"kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn\") pod \"console-55ff78d854-f4zw8\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") " pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.179717 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:25 crc kubenswrapper[4826]: I0319 19:04:25.583108 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"] Mar 19 19:04:26 crc kubenswrapper[4826]: I0319 19:04:26.196371 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-f4zw8" event={"ID":"000caec5-bbed-4ed7-b946-7f859ab61e98","Type":"ContainerStarted","Data":"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"} Mar 19 19:04:26 crc kubenswrapper[4826]: I0319 19:04:26.196851 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-f4zw8" event={"ID":"000caec5-bbed-4ed7-b946-7f859ab61e98","Type":"ContainerStarted","Data":"4ea2184973e17bbcfadc97427f2f12e44b8c499e6d60add58ea11b195fb32c4e"} Mar 19 19:04:26 crc kubenswrapper[4826]: I0319 19:04:26.225828 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55ff78d854-f4zw8" podStartSLOduration=2.225802605 podStartE2EDuration="2.225802605s" podCreationTimestamp="2026-03-19 19:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:04:26.224540009 +0000 UTC m=+490.978608322" watchObservedRunningTime="2026-03-19 19:04:26.225802605 +0000 UTC m=+490.979870938" Mar 19 19:04:35 crc kubenswrapper[4826]: I0319 19:04:35.180276 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:35 crc kubenswrapper[4826]: I0319 19:04:35.181811 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:35 crc kubenswrapper[4826]: I0319 19:04:35.189072 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:35 crc kubenswrapper[4826]: I0319 19:04:35.270098 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55ff78d854-f4zw8" Mar 19 19:04:35 crc kubenswrapper[4826]: I0319 19:04:35.367568 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.410521 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7c6477bfff-gdnfs" podUID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" containerName="console" containerID="cri-o://113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49" gracePeriod=15 Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.868693 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c6477bfff-gdnfs_cc698a42-bbab-4c83-b928-5b7dc3c75b33/console/0.log" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.868974 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883295 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8jv\" (UniqueName: \"kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883340 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883370 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883388 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883410 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.883445 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.884844 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert\") pod \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\" (UID: \"cc698a42-bbab-4c83-b928-5b7dc3c75b33\") " Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.885045 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.885431 4826 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.885532 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca" (OuterVolumeSpecName: "service-ca") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.885536 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.885601 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config" (OuterVolumeSpecName: "console-config") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.891325 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv" (OuterVolumeSpecName: "kube-api-access-ff8jv") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "kube-api-access-ff8jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.891371 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.894852 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cc698a42-bbab-4c83-b928-5b7dc3c75b33" (UID: "cc698a42-bbab-4c83-b928-5b7dc3c75b33"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987107 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987167 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987190 4826 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987208 4826 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987229 4826 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc698a42-bbab-4c83-b928-5b7dc3c75b33-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:00 crc kubenswrapper[4826]: I0319 19:05:00.987248 4826 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ff8jv\" (UniqueName: \"kubernetes.io/projected/cc698a42-bbab-4c83-b928-5b7dc3c75b33-kube-api-access-ff8jv\") on node \"crc\" DevicePath \"\"" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475281 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c6477bfff-gdnfs_cc698a42-bbab-4c83-b928-5b7dc3c75b33/console/0.log" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475371 4826 generic.go:334] "Generic (PLEG): container finished" podID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" containerID="113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49" exitCode=2 Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475418 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6477bfff-gdnfs" event={"ID":"cc698a42-bbab-4c83-b928-5b7dc3c75b33","Type":"ContainerDied","Data":"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49"} Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475458 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c6477bfff-gdnfs" event={"ID":"cc698a42-bbab-4c83-b928-5b7dc3c75b33","Type":"ContainerDied","Data":"c139b949197fd8038a8d3ce88ac1a0b8419ce316c3f9cac71fb522710cc47d2c"} Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475489 4826 scope.go:117] "RemoveContainer" containerID="113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.475536 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c6477bfff-gdnfs" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.508751 4826 scope.go:117] "RemoveContainer" containerID="113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49" Mar 19 19:05:01 crc kubenswrapper[4826]: E0319 19:05:01.509555 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49\": container with ID starting with 113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49 not found: ID does not exist" containerID="113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.509680 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49"} err="failed to get container status \"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49\": rpc error: code = NotFound desc = could not find container \"113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49\": container with ID starting with 113bc059868573a5b61e620a2b1aab79976c1b23cf34b72b5efb63aff638fa49 not found: ID does not exist" Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.531783 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.540424 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c6477bfff-gdnfs"] Mar 19 19:05:01 crc kubenswrapper[4826]: I0319 19:05:01.987535 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" path="/var/lib/kubelet/pods/cc698a42-bbab-4c83-b928-5b7dc3c75b33/volumes" Mar 19 19:05:55 crc kubenswrapper[4826]: I0319 19:05:55.400790 4826 patch_prober.go:28] interesting 
pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:05:55 crc kubenswrapper[4826]: I0319 19:05:55.401495 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.143879 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565786-5485l"] Mar 19 19:06:00 crc kubenswrapper[4826]: E0319 19:06:00.144417 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" containerName="console" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.144432 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" containerName="console" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.144579 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc698a42-bbab-4c83-b928-5b7dc3c75b33" containerName="console" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.145093 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.151620 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.151673 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.151864 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.153346 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwk9\" (UniqueName: \"kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9\") pod \"auto-csr-approver-29565786-5485l\" (UID: \"e18768f0-9479-4355-a573-f689a0e019d7\") " pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.158353 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-5485l"] Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.255338 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwk9\" (UniqueName: \"kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9\") pod \"auto-csr-approver-29565786-5485l\" (UID: \"e18768f0-9479-4355-a573-f689a0e019d7\") " pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.277129 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwk9\" (UniqueName: \"kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9\") pod \"auto-csr-approver-29565786-5485l\" (UID: \"e18768f0-9479-4355-a573-f689a0e019d7\") " 
pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.472893 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:00 crc kubenswrapper[4826]: I0319 19:06:00.947066 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-5485l"] Mar 19 19:06:01 crc kubenswrapper[4826]: I0319 19:06:01.287523 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-5485l" event={"ID":"e18768f0-9479-4355-a573-f689a0e019d7","Type":"ContainerStarted","Data":"745414b655ea0a5f23f7ff82eacaf52deef64796373379af7a11ac5b822b6882"} Mar 19 19:06:03 crc kubenswrapper[4826]: I0319 19:06:03.302727 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-5485l" event={"ID":"e18768f0-9479-4355-a573-f689a0e019d7","Type":"ContainerStarted","Data":"b5b2fb51112651a9da8493466d7b767343ce35db5484ed71b10d42ff42355bde"} Mar 19 19:06:04 crc kubenswrapper[4826]: I0319 19:06:04.312309 4826 generic.go:334] "Generic (PLEG): container finished" podID="e18768f0-9479-4355-a573-f689a0e019d7" containerID="b5b2fb51112651a9da8493466d7b767343ce35db5484ed71b10d42ff42355bde" exitCode=0 Mar 19 19:06:04 crc kubenswrapper[4826]: I0319 19:06:04.312478 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-5485l" event={"ID":"e18768f0-9479-4355-a573-f689a0e019d7","Type":"ContainerDied","Data":"b5b2fb51112651a9da8493466d7b767343ce35db5484ed71b10d42ff42355bde"} Mar 19 19:06:05 crc kubenswrapper[4826]: I0319 19:06:05.616789 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:05 crc kubenswrapper[4826]: I0319 19:06:05.637720 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cwk9\" (UniqueName: \"kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9\") pod \"e18768f0-9479-4355-a573-f689a0e019d7\" (UID: \"e18768f0-9479-4355-a573-f689a0e019d7\") " Mar 19 19:06:05 crc kubenswrapper[4826]: I0319 19:06:05.643837 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9" (OuterVolumeSpecName: "kube-api-access-9cwk9") pod "e18768f0-9479-4355-a573-f689a0e019d7" (UID: "e18768f0-9479-4355-a573-f689a0e019d7"). InnerVolumeSpecName "kube-api-access-9cwk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:06:05 crc kubenswrapper[4826]: I0319 19:06:05.739802 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cwk9\" (UniqueName: \"kubernetes.io/projected/e18768f0-9479-4355-a573-f689a0e019d7-kube-api-access-9cwk9\") on node \"crc\" DevicePath \"\"" Mar 19 19:06:06 crc kubenswrapper[4826]: I0319 19:06:06.331851 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565786-5485l" event={"ID":"e18768f0-9479-4355-a573-f689a0e019d7","Type":"ContainerDied","Data":"745414b655ea0a5f23f7ff82eacaf52deef64796373379af7a11ac5b822b6882"} Mar 19 19:06:06 crc kubenswrapper[4826]: I0319 19:06:06.331932 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="745414b655ea0a5f23f7ff82eacaf52deef64796373379af7a11ac5b822b6882" Mar 19 19:06:06 crc kubenswrapper[4826]: I0319 19:06:06.331959 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565786-5485l" Mar 19 19:06:06 crc kubenswrapper[4826]: I0319 19:06:06.381081 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-w8btv"] Mar 19 19:06:06 crc kubenswrapper[4826]: I0319 19:06:06.384529 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565780-w8btv"] Mar 19 19:06:07 crc kubenswrapper[4826]: I0319 19:06:07.988835 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b4c82a-0ec5-412b-96c9-f61bc323c223" path="/var/lib/kubelet/pods/c6b4c82a-0ec5-412b-96c9-f61bc323c223/volumes" Mar 19 19:06:16 crc kubenswrapper[4826]: I0319 19:06:16.549049 4826 scope.go:117] "RemoveContainer" containerID="0bec69c3b0d5bc0917d38cbb822b84fec430b28433b779379849e2b62c194246" Mar 19 19:06:25 crc kubenswrapper[4826]: I0319 19:06:25.401228 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:06:25 crc kubenswrapper[4826]: I0319 19:06:25.401963 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.400736 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:06:55 crc kubenswrapper[4826]: 
I0319 19:06:55.401266 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.401340 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.402228 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.402351 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a" gracePeriod=600 Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.727383 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a" exitCode=0 Mar 19 19:06:55 crc kubenswrapper[4826]: I0319 19:06:55.727440 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a"} Mar 19 19:06:55 crc 
kubenswrapper[4826]: I0319 19:06:55.727892 4826 scope.go:117] "RemoveContainer" containerID="3ee9874037af13ba40ff8e8492fc1cbe83ef3f4c7edf979a85a6a720d737c911" Mar 19 19:06:56 crc kubenswrapper[4826]: I0319 19:06:56.739213 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c"} Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.427194 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv"] Mar 19 19:07:09 crc kubenswrapper[4826]: E0319 19:07:09.428128 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18768f0-9479-4355-a573-f689a0e019d7" containerName="oc" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.428149 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18768f0-9479-4355-a573-f689a0e019d7" containerName="oc" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.428371 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18768f0-9479-4355-a573-f689a0e019d7" containerName="oc" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.435038 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.439589 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.458777 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv"] Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.534165 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.534618 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.534915 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jph5d\" (UniqueName: \"kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: 
I0319 19:07:09.635964 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.636924 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.637137 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jph5d\" (UniqueName: \"kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.637303 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.637456 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.667856 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jph5d\" (UniqueName: \"kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:09 crc kubenswrapper[4826]: I0319 19:07:09.759263 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:10 crc kubenswrapper[4826]: I0319 19:07:10.003687 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv"] Mar 19 19:07:10 crc kubenswrapper[4826]: I0319 19:07:10.855255 4826 generic.go:334] "Generic (PLEG): container finished" podID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerID="e08eb9b1b8dac040661eacb55f9bf28db8fe1f3db90c320f860a69dbe954b1e5" exitCode=0 Mar 19 19:07:10 crc kubenswrapper[4826]: I0319 19:07:10.855319 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" event={"ID":"e8bfbc17-505e-4154-98c2-6e9c25345308","Type":"ContainerDied","Data":"e08eb9b1b8dac040661eacb55f9bf28db8fe1f3db90c320f860a69dbe954b1e5"} Mar 19 19:07:10 crc kubenswrapper[4826]: I0319 19:07:10.855612 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" event={"ID":"e8bfbc17-505e-4154-98c2-6e9c25345308","Type":"ContainerStarted","Data":"288b55087f0c963b536f8b1b302950d101fddaa8bdd15856040edc1162b47563"} Mar 19 19:07:16 crc kubenswrapper[4826]: I0319 19:07:16.637092 4826 scope.go:117] "RemoveContainer" containerID="bd3d498d949ef15f2b6717c00121ba069a6744779c7bfaef6091b1a983c11748" Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648044 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmdll"] Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648790 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-controller" containerID="cri-o://f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648903 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-acl-logging" containerID="cri-o://61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648883 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="nbdb" containerID="cri-o://7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.649009 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.649048 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="sbdb" containerID="cri-o://f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648935 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-node" containerID="cri-o://8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.648999 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="northd" containerID="cri-o://906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.707551 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovnkube-controller" containerID="cri-o://04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" gracePeriod=30 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.936255 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fwtqp_66ede589-eceb-497a-b51a-f702f9181969/kube-multus/0.log" Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.936315 4826 generic.go:334] "Generic (PLEG): container finished" podID="66ede589-eceb-497a-b51a-f702f9181969" 
containerID="23953b69631e80f97d6e28d3ca594416c07f1a83bc445d00460c0edbb31552ab" exitCode=2 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.936382 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fwtqp" event={"ID":"66ede589-eceb-497a-b51a-f702f9181969","Type":"ContainerDied","Data":"23953b69631e80f97d6e28d3ca594416c07f1a83bc445d00460c0edbb31552ab"} Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.937336 4826 scope.go:117] "RemoveContainer" containerID="23953b69631e80f97d6e28d3ca594416c07f1a83bc445d00460c0edbb31552ab" Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.942505 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-acl-logging/0.log" Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943165 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-controller/0.log" Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943538 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" exitCode=0 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943924 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" exitCode=0 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943955 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" exitCode=0 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943965 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" 
containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" exitCode=143 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943974 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" exitCode=143 Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.943915 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b"} Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.944009 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683"} Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.944027 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36"} Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.944041 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0"} Mar 19 19:07:20 crc kubenswrapper[4826]: I0319 19:07:20.944054 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.849346 
4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-acl-logging/0.log" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.851183 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-controller/0.log" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.851948 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918311 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kpj5p"] Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918642 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="nbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918705 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="nbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918730 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="sbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918748 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="sbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918784 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918798 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918818 4826 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918830 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918861 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovnkube-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918878 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovnkube-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918906 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="northd" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918922 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="northd" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918953 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-acl-logging" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.918966 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-acl-logging" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.918988 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-node" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919000 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-node" Mar 19 19:07:21 crc kubenswrapper[4826]: E0319 19:07:21.919027 4826 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kubecfg-setup" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919043 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kubecfg-setup" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919262 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-node" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919299 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919314 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovn-acl-logging" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919332 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="northd" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919357 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="sbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919376 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="nbdb" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919391 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.919407 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerName="ovnkube-controller" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.921194 
4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.921365 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.921627 4826 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.924198 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.959856 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fwtqp_66ede589-eceb-497a-b51a-f702f9181969/kube-multus/0.log" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.960111 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fwtqp" event={"ID":"66ede589-eceb-497a-b51a-f702f9181969","Type":"ContainerStarted","Data":"d21b71bca73172a654aa6a34b9a5b5182f3758e3fd8521936ecaa4f4f0affa72"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.978232 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-acl-logging/0.log" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.978928 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmdll_d5a08d12-9af8-4524-8312-dac430ab73ac/ovn-controller/0.log" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.979546 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" exitCode=0 Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.979635 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" exitCode=0 Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.979669 4826 generic.go:334] "Generic (PLEG): container finished" podID="d5a08d12-9af8-4524-8312-dac430ab73ac" containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" exitCode=0 Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.979770 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.988897 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.988943 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.988959 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.988971 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmdll" event={"ID":"d5a08d12-9af8-4524-8312-dac430ab73ac","Type":"ContainerDied","Data":"3f12baf1fc0b617ff99a728b69b11b3b665630761f0efd3183686248dfe6d4c8"} Mar 19 19:07:21 crc kubenswrapper[4826]: I0319 19:07:21.988991 4826 scope.go:117] "RemoveContainer" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.015093 4826 scope.go:117] "RemoveContainer" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022139 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config\") pod 
\"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022167 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022253 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022285 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022437 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022523 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022567 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022595 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022879 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.022908 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.027894 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.027981 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.027999 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 
19:07:22.028023 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028042 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028068 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028087 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028103 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028125 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units\") pod 
\"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028141 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b4qs\" (UniqueName: \"kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028167 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028189 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028212 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028239 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd\") pod \"d5a08d12-9af8-4524-8312-dac430ab73ac\" (UID: \"d5a08d12-9af8-4524-8312-dac430ab73ac\") " Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028471 4826 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028722 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log" (OuterVolumeSpecName: "node-log") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.028758 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029165 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash" (OuterVolumeSpecName: "host-slash") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029233 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029242 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029263 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket" (OuterVolumeSpecName: "log-socket") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029310 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029429 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029409 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029498 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-node-log\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029700 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029771 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovn-node-metrics-cert\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029835 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-script-lib\") pod 
\"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029923 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f85r\" (UniqueName: \"kubernetes.io/projected/3431e3be-0e2e-4581-93a3-e3eebad0afee-kube-api-access-6f85r\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029996 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030065 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-config\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030159 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-etc-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030283 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030383 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-systemd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030480 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-env-overrides\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030566 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-systemd-units\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030648 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-bin\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.030795 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-slash\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.031084 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-log-socket\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.031201 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-netd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.031291 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-var-lib-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.031383 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-ovn\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.031847 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-kubelet\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.032039 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-netns\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.032313 4826 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.032387 4826 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.032844 4826 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033796 4826 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033814 4826 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 
crc kubenswrapper[4826]: I0319 19:07:22.033826 4826 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033838 4826 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033850 4826 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033862 4826 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033872 4826 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033884 4826 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.029734 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033895 4826 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033955 4826 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033972 4826 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.033985 4826 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.034785 4826 scope.go:117] "RemoveContainer" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.039479 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs" (OuterVolumeSpecName: "kube-api-access-4b4qs") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "kube-api-access-4b4qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.051153 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.060083 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d5a08d12-9af8-4524-8312-dac430ab73ac" (UID: "d5a08d12-9af8-4524-8312-dac430ab73ac"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.097032 4826 scope.go:117] "RemoveContainer" containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.110960 4826 scope.go:117] "RemoveContainer" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.127726 4826 scope.go:117] "RemoveContainer" containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135503 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-node-log\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135555 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135576 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovn-node-metrics-cert\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135595 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-script-lib\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135623 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f85r\" (UniqueName: \"kubernetes.io/projected/3431e3be-0e2e-4581-93a3-e3eebad0afee-kube-api-access-6f85r\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135696 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-config\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135726 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-etc-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135736 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135788 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135752 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135879 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-systemd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135911 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-node-log\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.135988 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136004 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-systemd-units\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136033 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-systemd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136043 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-env-overrides\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136086 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-bin\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136121 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-slash\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136200 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-log-socket\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136235 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-var-lib-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136269 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-bin\") pod 
\"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136270 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-netd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136311 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-cni-netd\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.137938 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-ovn\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136338 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-slash\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136403 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-var-lib-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136413 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-systemd-units\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136420 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-log-socket\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136853 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-config\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136983 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-env-overrides\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.137709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovnkube-script-lib\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.136231 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-etc-openvswitch\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138028 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-kubelet\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138037 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-run-ovn\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138066 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-kubelet\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138079 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-netns\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138185 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3431e3be-0e2e-4581-93a3-e3eebad0afee-host-run-netns\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138188 4826 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5a08d12-9af8-4524-8312-dac430ab73ac-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138215 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b4qs\" (UniqueName: \"kubernetes.io/projected/d5a08d12-9af8-4524-8312-dac430ab73ac-kube-api-access-4b4qs\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138226 4826 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5a08d12-9af8-4524-8312-dac430ab73ac-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.138235 4826 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5a08d12-9af8-4524-8312-dac430ab73ac-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.141733 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3431e3be-0e2e-4581-93a3-e3eebad0afee-ovn-node-metrics-cert\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.148739 4826 scope.go:117] "RemoveContainer" containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.152581 4826 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-6f85r\" (UniqueName: \"kubernetes.io/projected/3431e3be-0e2e-4581-93a3-e3eebad0afee-kube-api-access-6f85r\") pod \"ovnkube-node-kpj5p\" (UID: \"3431e3be-0e2e-4581-93a3-e3eebad0afee\") " pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.257238 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.265681 4826 scope.go:117] "RemoveContainer" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.322163 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmdll"] Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.326380 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmdll"] Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.343180 4826 scope.go:117] "RemoveContainer" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" Mar 19 19:07:22 crc kubenswrapper[4826]: W0319 19:07:22.354402 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3431e3be_0e2e_4581_93a3_e3eebad0afee.slice/crio-717bc4313bb76dfd7099d8822e3c5f74b3806ec32d7f0c7742ec64ada6c31784 WatchSource:0}: Error finding container 717bc4313bb76dfd7099d8822e3c5f74b3806ec32d7f0c7742ec64ada6c31784: Status 404 returned error can't find the container with id 717bc4313bb76dfd7099d8822e3c5f74b3806ec32d7f0c7742ec64ada6c31784 Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.365259 4826 scope.go:117] "RemoveContainer" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.365635 4826 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": container with ID starting with 04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b not found: ID does not exist" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.365731 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b"} err="failed to get container status \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": rpc error: code = NotFound desc = could not find container \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": container with ID starting with 04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.365759 4826 scope.go:117] "RemoveContainer" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.366119 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": container with ID starting with f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683 not found: ID does not exist" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366152 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683"} err="failed to get container status \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": rpc error: code = NotFound desc = could not find container 
\"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": container with ID starting with f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366178 4826 scope.go:117] "RemoveContainer" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.366434 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": container with ID starting with 7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36 not found: ID does not exist" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366478 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36"} err="failed to get container status \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": rpc error: code = NotFound desc = could not find container \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": container with ID starting with 7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366492 4826 scope.go:117] "RemoveContainer" containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.366888 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": container with ID starting with 906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6 not found: ID does not exist" 
containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366909 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6"} err="failed to get container status \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": rpc error: code = NotFound desc = could not find container \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": container with ID starting with 906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.366922 4826 scope.go:117] "RemoveContainer" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.367195 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": container with ID starting with 9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987 not found: ID does not exist" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.367245 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987"} err="failed to get container status \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": rpc error: code = NotFound desc = could not find container \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": container with ID starting with 9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.367287 4826 scope.go:117] 
"RemoveContainer" containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.367600 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": container with ID starting with 8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f not found: ID does not exist" containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.367633 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f"} err="failed to get container status \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": rpc error: code = NotFound desc = could not find container \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": container with ID starting with 8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.367682 4826 scope.go:117] "RemoveContainer" containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.368790 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": container with ID starting with 61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0 not found: ID does not exist" containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.368852 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0"} err="failed to get container status \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": rpc error: code = NotFound desc = could not find container \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": container with ID starting with 61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.368868 4826 scope.go:117] "RemoveContainer" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.369158 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": container with ID starting with f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb not found: ID does not exist" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.369223 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb"} err="failed to get container status \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": rpc error: code = NotFound desc = could not find container \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": container with ID starting with f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.369244 4826 scope.go:117] "RemoveContainer" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" Mar 19 19:07:22 crc kubenswrapper[4826]: E0319 19:07:22.369544 4826 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": container with ID starting with 1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c not found: ID does not exist" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.369583 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c"} err="failed to get container status \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": rpc error: code = NotFound desc = could not find container \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": container with ID starting with 1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.369606 4826 scope.go:117] "RemoveContainer" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.369969 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b"} err="failed to get container status \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": rpc error: code = NotFound desc = could not find container \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": container with ID starting with 04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370003 4826 scope.go:117] "RemoveContainer" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370300 4826 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683"} err="failed to get container status \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": rpc error: code = NotFound desc = could not find container \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": container with ID starting with f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370326 4826 scope.go:117] "RemoveContainer" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370613 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36"} err="failed to get container status \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": rpc error: code = NotFound desc = could not find container \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": container with ID starting with 7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370635 4826 scope.go:117] "RemoveContainer" containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370924 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6"} err="failed to get container status \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": rpc error: code = NotFound desc = could not find container \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": container with ID starting with 
906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.370951 4826 scope.go:117] "RemoveContainer" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371195 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987"} err="failed to get container status \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": rpc error: code = NotFound desc = could not find container \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": container with ID starting with 9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371228 4826 scope.go:117] "RemoveContainer" containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371540 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f"} err="failed to get container status \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": rpc error: code = NotFound desc = could not find container \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": container with ID starting with 8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371565 4826 scope.go:117] "RemoveContainer" containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371814 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0"} err="failed to get container status \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": rpc error: code = NotFound desc = could not find container \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": container with ID starting with 61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.371840 4826 scope.go:117] "RemoveContainer" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372062 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb"} err="failed to get container status \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": rpc error: code = NotFound desc = could not find container \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": container with ID starting with f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372085 4826 scope.go:117] "RemoveContainer" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372306 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c"} err="failed to get container status \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": rpc error: code = NotFound desc = could not find container \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": container with ID starting with 1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c not found: ID does not 
exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372322 4826 scope.go:117] "RemoveContainer" containerID="04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372542 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b"} err="failed to get container status \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": rpc error: code = NotFound desc = could not find container \"04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b\": container with ID starting with 04f8c9d39de66273aab67bc4a6c2d30f4e42c3520ff8fa7d26f0b711cc67792b not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372579 4826 scope.go:117] "RemoveContainer" containerID="f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.372989 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683"} err="failed to get container status \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": rpc error: code = NotFound desc = could not find container \"f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683\": container with ID starting with f368b3acfe3fc97f8ff0bc0ca84d609b243a60e28ee7004b7cc19d30f1081683 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373018 4826 scope.go:117] "RemoveContainer" containerID="7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373285 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36"} err="failed to get container status 
\"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": rpc error: code = NotFound desc = could not find container \"7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36\": container with ID starting with 7bdae07ceccbc9af72b58d07baa0246b76c101ec439c0d0cccda4933ee732d36 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373313 4826 scope.go:117] "RemoveContainer" containerID="906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373576 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6"} err="failed to get container status \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": rpc error: code = NotFound desc = could not find container \"906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6\": container with ID starting with 906e2b3d6954f05f75aee72a899184cb8df00bc6791da4a80cd788cf0aeb6dd6 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373593 4826 scope.go:117] "RemoveContainer" containerID="9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.373997 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987"} err="failed to get container status \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": rpc error: code = NotFound desc = could not find container \"9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987\": container with ID starting with 9c2feb7a785c80cfbb1039d4b73bd90734e4f14d6c680a1c4e3e6ad53e52e987 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374030 4826 scope.go:117] "RemoveContainer" 
containerID="8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374261 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f"} err="failed to get container status \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": rpc error: code = NotFound desc = could not find container \"8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f\": container with ID starting with 8e46f0d8e96721b4855b6ad1e1c626aa6015f077e082aa8699a0e1d9763bb26f not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374283 4826 scope.go:117] "RemoveContainer" containerID="61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374495 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0"} err="failed to get container status \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": rpc error: code = NotFound desc = could not find container \"61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0\": container with ID starting with 61849390975c705320f8f58d0e1c2ad8c1b75b8272e9648e52c5c9285350a9f0 not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374514 4826 scope.go:117] "RemoveContainer" containerID="f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374759 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb"} err="failed to get container status \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": rpc error: code = NotFound desc = could 
not find container \"f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb\": container with ID starting with f037fea284de0cac9aaaafaec1b782570ad70ea0f6342072f861e01a13da09cb not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.374779 4826 scope.go:117] "RemoveContainer" containerID="1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.375011 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c"} err="failed to get container status \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": rpc error: code = NotFound desc = could not find container \"1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c\": container with ID starting with 1d9d11401e0db6838e5ede12e71db43517ccb90500295373ea3bc3d284d5346c not found: ID does not exist" Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.993709 4826 generic.go:334] "Generic (PLEG): container finished" podID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerID="ee7b59ac109ce29012454f6550ffb1c90d546d7b0e8e74fad7f4a818226ffadd" exitCode=0 Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.993781 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" event={"ID":"e8bfbc17-505e-4154-98c2-6e9c25345308","Type":"ContainerDied","Data":"ee7b59ac109ce29012454f6550ffb1c90d546d7b0e8e74fad7f4a818226ffadd"} Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.995862 4826 generic.go:334] "Generic (PLEG): container finished" podID="3431e3be-0e2e-4581-93a3-e3eebad0afee" containerID="72afefe6d93aef1696be05146680974d7bbf524f655075b76dd47d9340c7e74d" exitCode=0 Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.995888 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerDied","Data":"72afefe6d93aef1696be05146680974d7bbf524f655075b76dd47d9340c7e74d"} Mar 19 19:07:22 crc kubenswrapper[4826]: I0319 19:07:22.995911 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"717bc4313bb76dfd7099d8822e3c5f74b3806ec32d7f0c7742ec64ada6c31784"} Mar 19 19:07:23 crc kubenswrapper[4826]: I0319 19:07:23.989564 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a08d12-9af8-4524-8312-dac430ab73ac" path="/var/lib/kubelet/pods/d5a08d12-9af8-4524-8312-dac430ab73ac/volumes" Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014070 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"478c92db0aaa0b3007eb48ee7964174616f4bd3b5166773e4fcc863bb72706d6"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014182 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"46ea5ecbafd5f73282c2c5e8f83fdf0d55144c28aeb6aa5d542102efd1d2561d"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014201 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"e8c0a432a0d9a48f50804a60d92f820c434e0046d48617be2445d1fd8161b5c8"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014229 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" 
event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"0d0ae3e95ec54dcbb77f3d4de420d52269c91d5e144365bf1c5b8fafab0e4ab3"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014258 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"87a2edae5f1d283df368159da4f336340e220d99049160105092b1745dfbb45b"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.014270 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"6c6b9cb0db0690e3243df031463cda40a415ae8ffcc472519a90348aa0d638a7"} Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.018484 4826 generic.go:334] "Generic (PLEG): container finished" podID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerID="a4e73cd2e884dc00210a7909667c5d1e5b12a350bef46f5bb004d38c13bff9c8" exitCode=0 Mar 19 19:07:24 crc kubenswrapper[4826]: I0319 19:07:24.018514 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" event={"ID":"e8bfbc17-505e-4154-98c2-6e9c25345308","Type":"ContainerDied","Data":"a4e73cd2e884dc00210a7909667c5d1e5b12a350bef46f5bb004d38c13bff9c8"} Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.152949 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.183839 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jph5d\" (UniqueName: \"kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d\") pod \"e8bfbc17-505e-4154-98c2-6e9c25345308\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.184006 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util\") pod \"e8bfbc17-505e-4154-98c2-6e9c25345308\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.184235 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle\") pod \"e8bfbc17-505e-4154-98c2-6e9c25345308\" (UID: \"e8bfbc17-505e-4154-98c2-6e9c25345308\") " Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.188593 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle" (OuterVolumeSpecName: "bundle") pod "e8bfbc17-505e-4154-98c2-6e9c25345308" (UID: "e8bfbc17-505e-4154-98c2-6e9c25345308"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.191001 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d" (OuterVolumeSpecName: "kube-api-access-jph5d") pod "e8bfbc17-505e-4154-98c2-6e9c25345308" (UID: "e8bfbc17-505e-4154-98c2-6e9c25345308"). InnerVolumeSpecName "kube-api-access-jph5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.198922 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util" (OuterVolumeSpecName: "util") pod "e8bfbc17-505e-4154-98c2-6e9c25345308" (UID: "e8bfbc17-505e-4154-98c2-6e9c25345308"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.286965 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.287031 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8bfbc17-505e-4154-98c2-6e9c25345308-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:25 crc kubenswrapper[4826]: I0319 19:07:25.287055 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jph5d\" (UniqueName: \"kubernetes.io/projected/e8bfbc17-505e-4154-98c2-6e9c25345308-kube-api-access-jph5d\") on node \"crc\" DevicePath \"\"" Mar 19 19:07:26 crc kubenswrapper[4826]: I0319 19:07:26.038416 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" event={"ID":"e8bfbc17-505e-4154-98c2-6e9c25345308","Type":"ContainerDied","Data":"288b55087f0c963b536f8b1b302950d101fddaa8bdd15856040edc1162b47563"} Mar 19 19:07:26 crc kubenswrapper[4826]: I0319 19:07:26.038483 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288b55087f0c963b536f8b1b302950d101fddaa8bdd15856040edc1162b47563" Mar 19 19:07:26 crc kubenswrapper[4826]: I0319 19:07:26.038494 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv" Mar 19 19:07:27 crc kubenswrapper[4826]: I0319 19:07:27.054547 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"c397e69c8afe108c5fa18a89ed071effd110a597adaaccc9573b05ac346834c2"} Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.070097 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" event={"ID":"3431e3be-0e2e-4581-93a3-e3eebad0afee","Type":"ContainerStarted","Data":"f3ea4ea618c68327b2a960ef1bb840b17a5e3ef479ed806f02841d7b39844354"} Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.070820 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.070958 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.071009 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.103616 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.107442 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:29 crc kubenswrapper[4826]: I0319 19:07:29.119297 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" podStartSLOduration=8.119272091 podStartE2EDuration="8.119272091s" podCreationTimestamp="2026-03-19 19:07:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:07:29.11361474 +0000 UTC m=+673.867683073" watchObservedRunningTime="2026-03-19 19:07:29.119272091 +0000 UTC m=+673.873340414" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.327225 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt"] Mar 19 19:07:39 crc kubenswrapper[4826]: E0319 19:07:39.328095 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="util" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.328108 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="util" Mar 19 19:07:39 crc kubenswrapper[4826]: E0319 19:07:39.328119 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="extract" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.328125 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="extract" Mar 19 19:07:39 crc kubenswrapper[4826]: E0319 19:07:39.328143 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="pull" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.328150 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="pull" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.328263 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bfbc17-505e-4154-98c2-6e9c25345308" containerName="extract" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.328792 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.330942 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.330990 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.332280 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-knjmp" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.345353 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt"] Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.387585 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5p69\" (UniqueName: \"kubernetes.io/projected/ac9e546d-1803-4ef0-a76c-6d9e0823c010-kube-api-access-n5p69\") pod \"obo-prometheus-operator-8ff7d675-q6rkt\" (UID: \"ac9e546d-1803-4ef0-a76c-6d9e0823c010\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.489086 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5p69\" (UniqueName: \"kubernetes.io/projected/ac9e546d-1803-4ef0-a76c-6d9e0823c010-kube-api-access-n5p69\") pod \"obo-prometheus-operator-8ff7d675-q6rkt\" (UID: \"ac9e546d-1803-4ef0-a76c-6d9e0823c010\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.517095 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5p69\" (UniqueName: \"kubernetes.io/projected/ac9e546d-1803-4ef0-a76c-6d9e0823c010-kube-api-access-n5p69\") pod 
\"obo-prometheus-operator-8ff7d675-q6rkt\" (UID: \"ac9e546d-1803-4ef0-a76c-6d9e0823c010\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.648920 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.688336 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv"] Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.689029 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.691288 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fkkws" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.692299 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.701905 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft"] Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.702853 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.708240 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv"] Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.713473 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft"] Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.797328 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.797679 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.797704 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.797763 
4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.903308 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.903374 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.903415 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.903432 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.908058 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.908092 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.908396 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2e6734a7-50cb-4366-baab-fa0feba677f0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv\" (UID: \"2e6734a7-50cb-4366-baab-fa0feba677f0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:39 crc kubenswrapper[4826]: I0319 19:07:39.909143 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1278453-65e7-42aa-8d60-b42e0c10f232-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft\" (UID: \"f1278453-65e7-42aa-8d60-b42e0c10f232\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.023607 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.031902 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.047839 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-tcdmb"] Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.048609 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.050360 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xggw9" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.050832 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.080796 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-tcdmb"] Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.106988 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/217c809e-0af8-4b11-a5ce-932d698ed444-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 
19:07:40.107084 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbzp\" (UniqueName: \"kubernetes.io/projected/217c809e-0af8-4b11-a5ce-932d698ed444-kube-api-access-4wbzp\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.122131 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt"] Mar 19 19:07:40 crc kubenswrapper[4826]: W0319 19:07:40.126610 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac9e546d_1803_4ef0_a76c_6d9e0823c010.slice/crio-d71fb5642f6777124311130f082a8a6c912271a70e59387c9995c82c45786e5e WatchSource:0}: Error finding container d71fb5642f6777124311130f082a8a6c912271a70e59387c9995c82c45786e5e: Status 404 returned error can't find the container with id d71fb5642f6777124311130f082a8a6c912271a70e59387c9995c82c45786e5e Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.139062 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" event={"ID":"ac9e546d-1803-4ef0-a76c-6d9e0823c010","Type":"ContainerStarted","Data":"d71fb5642f6777124311130f082a8a6c912271a70e59387c9995c82c45786e5e"} Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.208800 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbzp\" (UniqueName: \"kubernetes.io/projected/217c809e-0af8-4b11-a5ce-932d698ed444-kube-api-access-4wbzp\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.208922 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/217c809e-0af8-4b11-a5ce-932d698ed444-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.215111 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/217c809e-0af8-4b11-a5ce-932d698ed444-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.231751 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbzp\" (UniqueName: \"kubernetes.io/projected/217c809e-0af8-4b11-a5ce-932d698ed444-kube-api-access-4wbzp\") pod \"observability-operator-6dd7dd855f-tcdmb\" (UID: \"217c809e-0af8-4b11-a5ce-932d698ed444\") " pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.367315 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft"] Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.375186 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.460118 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv"] Mar 19 19:07:40 crc kubenswrapper[4826]: W0319 19:07:40.478546 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6734a7_50cb_4366_baab_fa0feba677f0.slice/crio-b89c95b37e63718c80a0ce989fb811c40787b10f6de5c12bc86a81a755db8d81 WatchSource:0}: Error finding container b89c95b37e63718c80a0ce989fb811c40787b10f6de5c12bc86a81a755db8d81: Status 404 returned error can't find the container with id b89c95b37e63718c80a0ce989fb811c40787b10f6de5c12bc86a81a755db8d81 Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.503468 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-6648f6899-wbmts"] Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.504264 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.508958 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.509118 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-v95rq" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.514377 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6648f6899-wbmts"] Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.616085 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb71543-680b-4018-94e4-572cfcc12660-openshift-service-ca\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.616166 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-webhook-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.616220 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r65m6\" (UniqueName: \"kubernetes.io/projected/8eb71543-680b-4018-94e4-572cfcc12660-kube-api-access-r65m6\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.616259 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-apiservice-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.717561 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-apiservice-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.717671 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb71543-680b-4018-94e4-572cfcc12660-openshift-service-ca\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.717712 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-webhook-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.717741 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r65m6\" (UniqueName: \"kubernetes.io/projected/8eb71543-680b-4018-94e4-572cfcc12660-kube-api-access-r65m6\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 
19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.719184 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb71543-680b-4018-94e4-572cfcc12660-openshift-service-ca\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.721356 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-webhook-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.721598 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8eb71543-680b-4018-94e4-572cfcc12660-apiservice-cert\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.737663 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r65m6\" (UniqueName: \"kubernetes.io/projected/8eb71543-680b-4018-94e4-572cfcc12660-kube-api-access-r65m6\") pod \"perses-operator-6648f6899-wbmts\" (UID: \"8eb71543-680b-4018-94e4-572cfcc12660\") " pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.882709 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:40 crc kubenswrapper[4826]: I0319 19:07:40.954176 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-tcdmb"] Mar 19 19:07:40 crc kubenswrapper[4826]: W0319 19:07:40.965032 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217c809e_0af8_4b11_a5ce_932d698ed444.slice/crio-1457a752b8f7b2062f27797390c3cd7bace5203d1e39bcc4f49308fdf86cfe40 WatchSource:0}: Error finding container 1457a752b8f7b2062f27797390c3cd7bace5203d1e39bcc4f49308fdf86cfe40: Status 404 returned error can't find the container with id 1457a752b8f7b2062f27797390c3cd7bace5203d1e39bcc4f49308fdf86cfe40 Mar 19 19:07:41 crc kubenswrapper[4826]: I0319 19:07:41.148432 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" event={"ID":"2e6734a7-50cb-4366-baab-fa0feba677f0","Type":"ContainerStarted","Data":"b89c95b37e63718c80a0ce989fb811c40787b10f6de5c12bc86a81a755db8d81"} Mar 19 19:07:41 crc kubenswrapper[4826]: I0319 19:07:41.150160 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" event={"ID":"f1278453-65e7-42aa-8d60-b42e0c10f232","Type":"ContainerStarted","Data":"e68b243b6ebf4e89611257cc1fde4bd25e087ddb1e7430e0ad516d5c526accb3"} Mar 19 19:07:41 crc kubenswrapper[4826]: I0319 19:07:41.150999 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" event={"ID":"217c809e-0af8-4b11-a5ce-932d698ed444","Type":"ContainerStarted","Data":"1457a752b8f7b2062f27797390c3cd7bace5203d1e39bcc4f49308fdf86cfe40"} Mar 19 19:07:41 crc kubenswrapper[4826]: I0319 19:07:41.347190 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/perses-operator-6648f6899-wbmts"] Mar 19 19:07:41 crc kubenswrapper[4826]: W0319 19:07:41.360781 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eb71543_680b_4018_94e4_572cfcc12660.slice/crio-9f609f8f5f50910a895d0f639a6b0028d34a42222400cd26176d9d218403d83c WatchSource:0}: Error finding container 9f609f8f5f50910a895d0f639a6b0028d34a42222400cd26176d9d218403d83c: Status 404 returned error can't find the container with id 9f609f8f5f50910a895d0f639a6b0028d34a42222400cd26176d9d218403d83c Mar 19 19:07:42 crc kubenswrapper[4826]: I0319 19:07:42.161418 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6648f6899-wbmts" event={"ID":"8eb71543-680b-4018-94e4-572cfcc12660","Type":"ContainerStarted","Data":"9f609f8f5f50910a895d0f639a6b0028d34a42222400cd26176d9d218403d83c"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.262531 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" event={"ID":"f1278453-65e7-42aa-8d60-b42e0c10f232","Type":"ContainerStarted","Data":"c1f36fdcbd4906efbca5312a983a47a161b0cd373cca091ab729671017fb7050"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.268119 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6648f6899-wbmts" event={"ID":"8eb71543-680b-4018-94e4-572cfcc12660","Type":"ContainerStarted","Data":"5e85860116fd12d94e3a4afdbd55c5ded48f7564742d91816ebf7c6762d2f6e5"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.268762 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.270486 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" 
event={"ID":"ac9e546d-1803-4ef0-a76c-6d9e0823c010","Type":"ContainerStarted","Data":"c023929c130217d9c3f0970e77d9af6be298b2206bccb77691e0b0e6de1c05e1"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.272852 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" event={"ID":"217c809e-0af8-4b11-a5ce-932d698ed444","Type":"ContainerStarted","Data":"4e64c0a4a98e62e740173a5f8f4c9dc44425c09198509fa5ce76cbdc06dfbea7"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.273461 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.280958 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.281732 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" event={"ID":"2e6734a7-50cb-4366-baab-fa0feba677f0","Type":"ContainerStarted","Data":"5cdd8cd911c1f7cf9fe5dd1d05c23ef524c3230764585b928c06fb7990bb5bbb"} Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.312176 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kpj5p" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.313373 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft" podStartSLOduration=2.042869015 podStartE2EDuration="13.313357502s" podCreationTimestamp="2026-03-19 19:07:39 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.378395354 +0000 UTC m=+685.132463667" lastFinishedPulling="2026-03-19 19:07:51.648883841 +0000 UTC m=+696.402952154" observedRunningTime="2026-03-19 19:07:52.311848064 +0000 UTC 
m=+697.065916377" watchObservedRunningTime="2026-03-19 19:07:52.313357502 +0000 UTC m=+697.067425815" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.338554 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-q6rkt" podStartSLOduration=1.817364424 podStartE2EDuration="13.33853735s" podCreationTimestamp="2026-03-19 19:07:39 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.128841993 +0000 UTC m=+684.882910306" lastFinishedPulling="2026-03-19 19:07:51.650014929 +0000 UTC m=+696.404083232" observedRunningTime="2026-03-19 19:07:52.335841753 +0000 UTC m=+697.089910066" watchObservedRunningTime="2026-03-19 19:07:52.33853735 +0000 UTC m=+697.092605653" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.386969 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podStartSLOduration=1.692540538 podStartE2EDuration="12.38695326s" podCreationTimestamp="2026-03-19 19:07:40 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.972584611 +0000 UTC m=+685.726652924" lastFinishedPulling="2026-03-19 19:07:51.666997333 +0000 UTC m=+696.421065646" observedRunningTime="2026-03-19 19:07:52.377676658 +0000 UTC m=+697.131744981" watchObservedRunningTime="2026-03-19 19:07:52.38695326 +0000 UTC m=+697.141021573" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.434674 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv" podStartSLOduration=2.249014902 podStartE2EDuration="13.43464038s" podCreationTimestamp="2026-03-19 19:07:39 +0000 UTC" firstStartedPulling="2026-03-19 19:07:40.481375715 +0000 UTC m=+685.235444028" lastFinishedPulling="2026-03-19 19:07:51.667001193 +0000 UTC m=+696.421069506" observedRunningTime="2026-03-19 19:07:52.42581563 +0000 UTC m=+697.179883933" watchObservedRunningTime="2026-03-19 
19:07:52.43464038 +0000 UTC m=+697.188708693" Mar 19 19:07:52 crc kubenswrapper[4826]: I0319 19:07:52.526864 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-6648f6899-wbmts" podStartSLOduration=2.2412777090000002 podStartE2EDuration="12.526849003s" podCreationTimestamp="2026-03-19 19:07:40 +0000 UTC" firstStartedPulling="2026-03-19 19:07:41.364841065 +0000 UTC m=+686.118909378" lastFinishedPulling="2026-03-19 19:07:51.650412359 +0000 UTC m=+696.404480672" observedRunningTime="2026-03-19 19:07:52.485519441 +0000 UTC m=+697.239587754" watchObservedRunningTime="2026-03-19 19:07:52.526849003 +0000 UTC m=+697.280917336" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.159760 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565788-qhtlk"] Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.161236 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.163271 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.163604 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.164052 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.176671 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-qhtlk"] Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.235216 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlfxf\" (UniqueName: 
\"kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf\") pod \"auto-csr-approver-29565788-qhtlk\" (UID: \"22b7e1a6-658b-4f9d-a9ee-947a6283266f\") " pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.336915 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlfxf\" (UniqueName: \"kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf\") pod \"auto-csr-approver-29565788-qhtlk\" (UID: \"22b7e1a6-658b-4f9d-a9ee-947a6283266f\") " pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.378401 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlfxf\" (UniqueName: \"kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf\") pod \"auto-csr-approver-29565788-qhtlk\" (UID: \"22b7e1a6-658b-4f9d-a9ee-947a6283266f\") " pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.481034 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.885542 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 19:08:00 crc kubenswrapper[4826]: I0319 19:08:00.911762 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-qhtlk"] Mar 19 19:08:01 crc kubenswrapper[4826]: I0319 19:08:01.339685 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" event={"ID":"22b7e1a6-658b-4f9d-a9ee-947a6283266f","Type":"ContainerStarted","Data":"8614e8bdda12175fef471ae07d2da35ad7b3c92ec7ba72feecc70789aed41ed5"} Mar 19 19:08:02 crc kubenswrapper[4826]: I0319 19:08:02.350357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" event={"ID":"22b7e1a6-658b-4f9d-a9ee-947a6283266f","Type":"ContainerStarted","Data":"253ccce7086c613fe8e20ecca1aacd8e92ef8fb49f779399af2e442d661625c0"} Mar 19 19:08:02 crc kubenswrapper[4826]: I0319 19:08:02.364857 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" podStartSLOduration=1.402847111 podStartE2EDuration="2.364844031s" podCreationTimestamp="2026-03-19 19:08:00 +0000 UTC" firstStartedPulling="2026-03-19 19:08:00.915723667 +0000 UTC m=+705.669791980" lastFinishedPulling="2026-03-19 19:08:01.877720587 +0000 UTC m=+706.631788900" observedRunningTime="2026-03-19 19:08:02.362235525 +0000 UTC m=+707.116303838" watchObservedRunningTime="2026-03-19 19:08:02.364844031 +0000 UTC m=+707.118912344" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.292006 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-btnzb"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.293011 4826 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-btnzb" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.297736 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.298437 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.300142 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.300248 4826 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dbjl8" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.300553 4826 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-42m4d" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.300790 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.302679 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhw8v"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.303693 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.305036 4826 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nmxvj" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.306894 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-btnzb"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.320142 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.324347 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhw8v"] Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.358326 4826 generic.go:334] "Generic (PLEG): container finished" podID="22b7e1a6-658b-4f9d-a9ee-947a6283266f" containerID="253ccce7086c613fe8e20ecca1aacd8e92ef8fb49f779399af2e442d661625c0" exitCode=0 Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.358365 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" event={"ID":"22b7e1a6-658b-4f9d-a9ee-947a6283266f","Type":"ContainerDied","Data":"253ccce7086c613fe8e20ecca1aacd8e92ef8fb49f779399af2e442d661625c0"} Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.389011 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhjg\" (UniqueName: \"kubernetes.io/projected/5161bc15-b664-4436-ae44-dc85f97b5dd3-kube-api-access-hrhjg\") pod \"cert-manager-858654f9db-btnzb\" (UID: \"5161bc15-b664-4436-ae44-dc85f97b5dd3\") " pod="cert-manager/cert-manager-858654f9db-btnzb" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.389055 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczxj\" (UniqueName: 
\"kubernetes.io/projected/ed4460c1-0ccc-4545-9477-7a5e8380c5e6-kube-api-access-cczxj\") pod \"cert-manager-cainjector-cf98fcc89-mdv5n\" (UID: \"ed4460c1-0ccc-4545-9477-7a5e8380c5e6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.389137 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpcb\" (UniqueName: \"kubernetes.io/projected/fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3-kube-api-access-drpcb\") pod \"cert-manager-webhook-687f57d79b-fhw8v\" (UID: \"fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.490686 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpcb\" (UniqueName: \"kubernetes.io/projected/fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3-kube-api-access-drpcb\") pod \"cert-manager-webhook-687f57d79b-fhw8v\" (UID: \"fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.490752 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhjg\" (UniqueName: \"kubernetes.io/projected/5161bc15-b664-4436-ae44-dc85f97b5dd3-kube-api-access-hrhjg\") pod \"cert-manager-858654f9db-btnzb\" (UID: \"5161bc15-b664-4436-ae44-dc85f97b5dd3\") " pod="cert-manager/cert-manager-858654f9db-btnzb" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.490780 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cczxj\" (UniqueName: \"kubernetes.io/projected/ed4460c1-0ccc-4545-9477-7a5e8380c5e6-kube-api-access-cczxj\") pod \"cert-manager-cainjector-cf98fcc89-mdv5n\" (UID: \"ed4460c1-0ccc-4545-9477-7a5e8380c5e6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 
19:08:03.509673 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczxj\" (UniqueName: \"kubernetes.io/projected/ed4460c1-0ccc-4545-9477-7a5e8380c5e6-kube-api-access-cczxj\") pod \"cert-manager-cainjector-cf98fcc89-mdv5n\" (UID: \"ed4460c1-0ccc-4545-9477-7a5e8380c5e6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.509941 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpcb\" (UniqueName: \"kubernetes.io/projected/fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3-kube-api-access-drpcb\") pod \"cert-manager-webhook-687f57d79b-fhw8v\" (UID: \"fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.515026 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhjg\" (UniqueName: \"kubernetes.io/projected/5161bc15-b664-4436-ae44-dc85f97b5dd3-kube-api-access-hrhjg\") pod \"cert-manager-858654f9db-btnzb\" (UID: \"5161bc15-b664-4436-ae44-dc85f97b5dd3\") " pod="cert-manager/cert-manager-858654f9db-btnzb" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.607979 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-btnzb" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.615518 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" Mar 19 19:08:03 crc kubenswrapper[4826]: I0319 19:08:03.626207 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.085755 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n"] Mar 19 19:08:04 crc kubenswrapper[4826]: W0319 19:08:04.088235 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4460c1_0ccc_4545_9477_7a5e8380c5e6.slice/crio-d5e82c1baf3cd72744a9dfd08111283f0c0bdf99ffb73a68674b3b67be3ac483 WatchSource:0}: Error finding container d5e82c1baf3cd72744a9dfd08111283f0c0bdf99ffb73a68674b3b67be3ac483: Status 404 returned error can't find the container with id d5e82c1baf3cd72744a9dfd08111283f0c0bdf99ffb73a68674b3b67be3ac483 Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.143352 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-btnzb"] Mar 19 19:08:04 crc kubenswrapper[4826]: W0319 19:08:04.147000 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5161bc15_b664_4436_ae44_dc85f97b5dd3.slice/crio-aee4732acb83cc0ab80096eead8e7ae248d154cc0645907294a382b9e8958cd8 WatchSource:0}: Error finding container aee4732acb83cc0ab80096eead8e7ae248d154cc0645907294a382b9e8958cd8: Status 404 returned error can't find the container with id aee4732acb83cc0ab80096eead8e7ae248d154cc0645907294a382b9e8958cd8 Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.204737 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-fhw8v"] Mar 19 19:08:04 crc kubenswrapper[4826]: W0319 19:08:04.207349 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe2ad622_0df2_4cb2_8c00_45f4d9a8a1c3.slice/crio-bfb901d817301011383b03ebd310007cf3036edc03259844e9a6220678933623 
WatchSource:0}: Error finding container bfb901d817301011383b03ebd310007cf3036edc03259844e9a6220678933623: Status 404 returned error can't find the container with id bfb901d817301011383b03ebd310007cf3036edc03259844e9a6220678933623 Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.364783 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-btnzb" event={"ID":"5161bc15-b664-4436-ae44-dc85f97b5dd3","Type":"ContainerStarted","Data":"aee4732acb83cc0ab80096eead8e7ae248d154cc0645907294a382b9e8958cd8"} Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.365927 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" event={"ID":"fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3","Type":"ContainerStarted","Data":"bfb901d817301011383b03ebd310007cf3036edc03259844e9a6220678933623"} Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.367498 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" event={"ID":"ed4460c1-0ccc-4545-9477-7a5e8380c5e6","Type":"ContainerStarted","Data":"d5e82c1baf3cd72744a9dfd08111283f0c0bdf99ffb73a68674b3b67be3ac483"} Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.743364 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.825034 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlfxf\" (UniqueName: \"kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf\") pod \"22b7e1a6-658b-4f9d-a9ee-947a6283266f\" (UID: \"22b7e1a6-658b-4f9d-a9ee-947a6283266f\") " Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.831359 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf" (OuterVolumeSpecName: "kube-api-access-rlfxf") pod "22b7e1a6-658b-4f9d-a9ee-947a6283266f" (UID: "22b7e1a6-658b-4f9d-a9ee-947a6283266f"). InnerVolumeSpecName "kube-api-access-rlfxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:08:04 crc kubenswrapper[4826]: I0319 19:08:04.928073 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlfxf\" (UniqueName: \"kubernetes.io/projected/22b7e1a6-658b-4f9d-a9ee-947a6283266f-kube-api-access-rlfxf\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:05 crc kubenswrapper[4826]: I0319 19:08:05.392985 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" event={"ID":"22b7e1a6-658b-4f9d-a9ee-947a6283266f","Type":"ContainerDied","Data":"8614e8bdda12175fef471ae07d2da35ad7b3c92ec7ba72feecc70789aed41ed5"} Mar 19 19:08:05 crc kubenswrapper[4826]: I0319 19:08:05.393055 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8614e8bdda12175fef471ae07d2da35ad7b3c92ec7ba72feecc70789aed41ed5" Mar 19 19:08:05 crc kubenswrapper[4826]: I0319 19:08:05.393139 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565788-qhtlk" Mar 19 19:08:05 crc kubenswrapper[4826]: I0319 19:08:05.423868 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-kmh7v"] Mar 19 19:08:05 crc kubenswrapper[4826]: I0319 19:08:05.428906 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565782-kmh7v"] Mar 19 19:08:06 crc kubenswrapper[4826]: I0319 19:08:06.004098 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e93cc0d-3987-4256-b7cb-6796e66fd509" path="/var/lib/kubelet/pods/0e93cc0d-3987-4256-b7cb-6796e66fd509/volumes" Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.425601 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" event={"ID":"ed4460c1-0ccc-4545-9477-7a5e8380c5e6","Type":"ContainerStarted","Data":"3f1764bdbb640b822f69806a923c5541c60624c61b3baef990ff4df5b0da4aaf"} Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.427388 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-btnzb" event={"ID":"5161bc15-b664-4436-ae44-dc85f97b5dd3","Type":"ContainerStarted","Data":"ee55f808dde0061e3742ad140e27d46a6256fd02f77761ce98230e33d23cd47e"} Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.429113 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" event={"ID":"fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3","Type":"ContainerStarted","Data":"eb25e1a7f636d76405c62c75f23bc96190aa157fc71aa0d29d74fb6db7a63350"} Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.429296 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.484120 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-mdv5n" podStartSLOduration=2.018181823 podStartE2EDuration="7.484092083s" podCreationTimestamp="2026-03-19 19:08:03 +0000 UTC" firstStartedPulling="2026-03-19 19:08:04.090476099 +0000 UTC m=+708.844544412" lastFinishedPulling="2026-03-19 19:08:09.556386319 +0000 UTC m=+714.310454672" observedRunningTime="2026-03-19 19:08:10.447843148 +0000 UTC m=+715.201911471" watchObservedRunningTime="2026-03-19 19:08:10.484092083 +0000 UTC m=+715.238160426" Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.485596 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-btnzb" podStartSLOduration=1.9808554310000002 podStartE2EDuration="7.48558696s" podCreationTimestamp="2026-03-19 19:08:03 +0000 UTC" firstStartedPulling="2026-03-19 19:08:04.149066592 +0000 UTC m=+708.903134905" lastFinishedPulling="2026-03-19 19:08:09.653798101 +0000 UTC m=+714.407866434" observedRunningTime="2026-03-19 19:08:10.478413171 +0000 UTC m=+715.232481494" watchObservedRunningTime="2026-03-19 19:08:10.48558696 +0000 UTC m=+715.239655283" Mar 19 19:08:10 crc kubenswrapper[4826]: I0319 19:08:10.501693 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" podStartSLOduration=2.042805209 podStartE2EDuration="7.501674913s" podCreationTimestamp="2026-03-19 19:08:03 +0000 UTC" firstStartedPulling="2026-03-19 19:08:04.209516352 +0000 UTC m=+708.963584665" lastFinishedPulling="2026-03-19 19:08:09.668386046 +0000 UTC m=+714.422454369" observedRunningTime="2026-03-19 19:08:10.49957576 +0000 UTC m=+715.253644063" watchObservedRunningTime="2026-03-19 19:08:10.501674913 +0000 UTC m=+715.255743236" Mar 19 19:08:16 crc kubenswrapper[4826]: I0319 19:08:16.706829 4826 scope.go:117] "RemoveContainer" containerID="c9547ac988e9316cc508204e8ce335d1d20f5b6a5ee44042e52aeb47ed62c566" Mar 19 19:08:18 crc kubenswrapper[4826]: I0319 
19:08:18.631056 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" Mar 19 19:08:23 crc kubenswrapper[4826]: I0319 19:08:23.659081 4826 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.445103 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf"] Mar 19 19:08:46 crc kubenswrapper[4826]: E0319 19:08:46.448009 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b7e1a6-658b-4f9d-a9ee-947a6283266f" containerName="oc" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.448027 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b7e1a6-658b-4f9d-a9ee-947a6283266f" containerName="oc" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.448175 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b7e1a6-658b-4f9d-a9ee-947a6283266f" containerName="oc" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.449233 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.452505 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.457395 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf"] Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.624465 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xpn\" (UniqueName: \"kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.624687 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.624739 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: 
I0319 19:08:46.726578 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.726715 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.726882 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xpn\" (UniqueName: \"kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.727250 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.727387 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.754345 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xpn\" (UniqueName: \"kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.764968 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.826322 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz"] Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.828756 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.839912 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz"] Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.930079 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mf9\" (UniqueName: \"kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.930208 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:46 crc kubenswrapper[4826]: I0319 19:08:46.930239 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.031230 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mf9\" (UniqueName: 
\"kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.031385 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.031418 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.031887 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.031935 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: 
\"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.050788 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mf9\" (UniqueName: \"kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.156283 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.243244 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf"] Mar 19 19:08:47 crc kubenswrapper[4826]: W0319 19:08:47.448455 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40389493_fee7_4684_8af2_6b5845158143.slice/crio-289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06 WatchSource:0}: Error finding container 289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06: Status 404 returned error can't find the container with id 289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06 Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.452589 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz"] Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.780387 4826 generic.go:334] "Generic (PLEG): container finished" podID="40389493-fee7-4684-8af2-6b5845158143" 
containerID="79f610c4df321a47cffece59e4d9db829b6c8ea2267a5728c61407e9dc23864d" exitCode=0 Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.780489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" event={"ID":"40389493-fee7-4684-8af2-6b5845158143","Type":"ContainerDied","Data":"79f610c4df321a47cffece59e4d9db829b6c8ea2267a5728c61407e9dc23864d"} Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.780811 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" event={"ID":"40389493-fee7-4684-8af2-6b5845158143","Type":"ContainerStarted","Data":"289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06"} Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.783487 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.786163 4826 generic.go:334] "Generic (PLEG): container finished" podID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerID="7fc1d54bd5951a8aa18031fabca7420819582e7035e5b969fcdb34ab32b44e7b" exitCode=0 Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.786214 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" event={"ID":"1e8dab2c-5a4e-4383-85a4-3422eac078ee","Type":"ContainerDied","Data":"7fc1d54bd5951a8aa18031fabca7420819582e7035e5b969fcdb34ab32b44e7b"} Mar 19 19:08:47 crc kubenswrapper[4826]: I0319 19:08:47.786251 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" event={"ID":"1e8dab2c-5a4e-4383-85a4-3422eac078ee","Type":"ContainerStarted","Data":"85e03aa780312ee0621e6efaac5015ef0a897ac2773898872e225f8253f011f4"} Mar 19 19:08:49 crc kubenswrapper[4826]: I0319 
19:08:49.800761 4826 generic.go:334] "Generic (PLEG): container finished" podID="40389493-fee7-4684-8af2-6b5845158143" containerID="26e8451e538752def4241333c53a92c844d42dd37084df7e30c8f069fdc2cefa" exitCode=0 Mar 19 19:08:49 crc kubenswrapper[4826]: I0319 19:08:49.800866 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" event={"ID":"40389493-fee7-4684-8af2-6b5845158143","Type":"ContainerDied","Data":"26e8451e538752def4241333c53a92c844d42dd37084df7e30c8f069fdc2cefa"} Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.158678 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.160359 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.178485 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.280419 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.280466 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.280491 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkkgp\" (UniqueName: \"kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.381702 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkkgp\" (UniqueName: \"kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.381864 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.381896 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.382398 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.382425 4826 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.407820 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkkgp\" (UniqueName: \"kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp\") pod \"redhat-operators-295bd\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.476868 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.691529 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:08:50 crc kubenswrapper[4826]: W0319 19:08:50.696471 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8929441_a64d_4fd8_aa9a_12b6585090e5.slice/crio-d48e3084534869b35086c7381a4563ded421096b7c3f9d0f0f6b25e9add3f8ab WatchSource:0}: Error finding container d48e3084534869b35086c7381a4563ded421096b7c3f9d0f0f6b25e9add3f8ab: Status 404 returned error can't find the container with id d48e3084534869b35086c7381a4563ded421096b7c3f9d0f0f6b25e9add3f8ab Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.808865 4826 generic.go:334] "Generic (PLEG): container finished" podID="40389493-fee7-4684-8af2-6b5845158143" containerID="133bc69938335e8e2248754191c9443230edac5b9c2bd58b52190336f4195323" exitCode=0 Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.808971 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" event={"ID":"40389493-fee7-4684-8af2-6b5845158143","Type":"ContainerDied","Data":"133bc69938335e8e2248754191c9443230edac5b9c2bd58b52190336f4195323"} Mar 19 19:08:50 crc kubenswrapper[4826]: I0319 19:08:50.810118 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerStarted","Data":"d48e3084534869b35086c7381a4563ded421096b7c3f9d0f0f6b25e9add3f8ab"} Mar 19 19:08:51 crc kubenswrapper[4826]: I0319 19:08:51.818302 4826 generic.go:334] "Generic (PLEG): container finished" podID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerID="8e632c0b3cf98a82f2ccf0ccff6ee20cf96c64dc86ff1ebf498702c0e0e7efd3" exitCode=0 Mar 19 19:08:51 crc kubenswrapper[4826]: I0319 19:08:51.819298 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerDied","Data":"8e632c0b3cf98a82f2ccf0ccff6ee20cf96c64dc86ff1ebf498702c0e0e7efd3"} Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.082880 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.210730 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util\") pod \"40389493-fee7-4684-8af2-6b5845158143\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.210785 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle\") pod \"40389493-fee7-4684-8af2-6b5845158143\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.210845 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75mf9\" (UniqueName: \"kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9\") pod \"40389493-fee7-4684-8af2-6b5845158143\" (UID: \"40389493-fee7-4684-8af2-6b5845158143\") " Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.211638 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle" (OuterVolumeSpecName: "bundle") pod "40389493-fee7-4684-8af2-6b5845158143" (UID: "40389493-fee7-4684-8af2-6b5845158143"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.216498 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9" (OuterVolumeSpecName: "kube-api-access-75mf9") pod "40389493-fee7-4684-8af2-6b5845158143" (UID: "40389493-fee7-4684-8af2-6b5845158143"). InnerVolumeSpecName "kube-api-access-75mf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.241205 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util" (OuterVolumeSpecName: "util") pod "40389493-fee7-4684-8af2-6b5845158143" (UID: "40389493-fee7-4684-8af2-6b5845158143"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.312516 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75mf9\" (UniqueName: \"kubernetes.io/projected/40389493-fee7-4684-8af2-6b5845158143-kube-api-access-75mf9\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.312556 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.312566 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40389493-fee7-4684-8af2-6b5845158143-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.825610 4826 generic.go:334] "Generic (PLEG): container finished" podID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerID="0e5ac4ecc1208375e80d5bf982a882d75c9ddcb370d2e281e22c99622005af53" exitCode=0 Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.825701 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" event={"ID":"1e8dab2c-5a4e-4383-85a4-3422eac078ee","Type":"ContainerDied","Data":"0e5ac4ecc1208375e80d5bf982a882d75c9ddcb370d2e281e22c99622005af53"} Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.830035 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" event={"ID":"40389493-fee7-4684-8af2-6b5845158143","Type":"ContainerDied","Data":"289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06"} Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.830060 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289c5cba4bafcc83b2f9a646140a25580e574d4dc95ac5372d4f2027d95c1f06" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.830074 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz" Mar 19 19:08:52 crc kubenswrapper[4826]: I0319 19:08:52.831589 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerStarted","Data":"428c8464a031245eeaff4997467dc696368fe0618be370763fb84ca8994fa380"} Mar 19 19:08:53 crc kubenswrapper[4826]: I0319 19:08:53.845222 4826 generic.go:334] "Generic (PLEG): container finished" podID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerID="b05c1ae3f74b74109f9145fd4133d05a08f6a1fadec94026e219acbc31740b7d" exitCode=0 Mar 19 19:08:53 crc kubenswrapper[4826]: I0319 19:08:53.845311 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" event={"ID":"1e8dab2c-5a4e-4383-85a4-3422eac078ee","Type":"ContainerDied","Data":"b05c1ae3f74b74109f9145fd4133d05a08f6a1fadec94026e219acbc31740b7d"} Mar 19 19:08:53 crc kubenswrapper[4826]: I0319 19:08:53.849238 4826 generic.go:334] "Generic (PLEG): container finished" podID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerID="428c8464a031245eeaff4997467dc696368fe0618be370763fb84ca8994fa380" exitCode=0 Mar 19 19:08:53 crc kubenswrapper[4826]: I0319 19:08:53.849332 4826 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerDied","Data":"428c8464a031245eeaff4997467dc696368fe0618be370763fb84ca8994fa380"} Mar 19 19:08:54 crc kubenswrapper[4826]: I0319 19:08:54.865686 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerStarted","Data":"1fe4b3d6867cf38019a25240af4b912e650b14fc9e8daeea59e52453d8485846"} Mar 19 19:08:54 crc kubenswrapper[4826]: I0319 19:08:54.901894 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-295bd" podStartSLOduration=2.36430708 podStartE2EDuration="4.90186718s" podCreationTimestamp="2026-03-19 19:08:50 +0000 UTC" firstStartedPulling="2026-03-19 19:08:51.819731325 +0000 UTC m=+756.573799648" lastFinishedPulling="2026-03-19 19:08:54.357291395 +0000 UTC m=+759.111359748" observedRunningTime="2026-03-19 19:08:54.896316644 +0000 UTC m=+759.650384997" watchObservedRunningTime="2026-03-19 19:08:54.90186718 +0000 UTC m=+759.655935513" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.220811 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.365494 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util\") pod \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.366248 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xpn\" (UniqueName: \"kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn\") pod \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.366282 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle\") pod \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\" (UID: \"1e8dab2c-5a4e-4383-85a4-3422eac078ee\") " Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.367209 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle" (OuterVolumeSpecName: "bundle") pod "1e8dab2c-5a4e-4383-85a4-3422eac078ee" (UID: "1e8dab2c-5a4e-4383-85a4-3422eac078ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.371918 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn" (OuterVolumeSpecName: "kube-api-access-n5xpn") pod "1e8dab2c-5a4e-4383-85a4-3422eac078ee" (UID: "1e8dab2c-5a4e-4383-85a4-3422eac078ee"). InnerVolumeSpecName "kube-api-access-n5xpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.380030 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util" (OuterVolumeSpecName: "util") pod "1e8dab2c-5a4e-4383-85a4-3422eac078ee" (UID: "1e8dab2c-5a4e-4383-85a4-3422eac078ee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.400490 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.400555 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.468008 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xpn\" (UniqueName: \"kubernetes.io/projected/1e8dab2c-5a4e-4383-85a4-3422eac078ee-kube-api-access-n5xpn\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.468042 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.468050 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e8dab2c-5a4e-4383-85a4-3422eac078ee-util\") on node \"crc\" DevicePath \"\"" 
Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.877767 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.877760 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf" event={"ID":"1e8dab2c-5a4e-4383-85a4-3422eac078ee","Type":"ContainerDied","Data":"85e03aa780312ee0621e6efaac5015ef0a897ac2773898872e225f8253f011f4"} Mar 19 19:08:55 crc kubenswrapper[4826]: I0319 19:08:55.877866 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85e03aa780312ee0621e6efaac5015ef0a897ac2773898872e225f8253f011f4" Mar 19 19:09:00 crc kubenswrapper[4826]: I0319 19:09:00.477907 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:00 crc kubenswrapper[4826]: I0319 19:09:00.478397 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:01 crc kubenswrapper[4826]: I0319 19:09:01.544504 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-295bd" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="registry-server" probeResult="failure" output=< Mar 19 19:09:01 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:09:01 crc kubenswrapper[4826]: > Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.690451 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s"] Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691246 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="util" Mar 
19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691261 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="util" Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691277 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691284 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691296 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691303 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691313 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="util" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691322 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="util" Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691333 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="pull" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691341 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="pull" Mar 19 19:09:03 crc kubenswrapper[4826]: E0319 19:09:03.691363 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="pull" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691371 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="pull" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691521 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="40389493-fee7-4684-8af2-6b5845158143" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.691533 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8dab2c-5a4e-4383-85a4-3422eac078ee" containerName="extract" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.692298 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.694865 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.695154 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.695350 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.695563 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-kn6xk" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.695742 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.699099 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.717624 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s"] Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.795668 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-apiservice-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.795714 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.795744 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-webhook-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.795763 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcsdn\" (UniqueName: \"kubernetes.io/projected/84bba80c-841e-4df3-87e0-901afbc23bf3-kube-api-access-vcsdn\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc 
kubenswrapper[4826]: I0319 19:09:03.795860 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/84bba80c-841e-4df3-87e0-901afbc23bf3-manager-config\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.897724 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-apiservice-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.897792 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.897836 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-webhook-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.897861 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcsdn\" (UniqueName: 
\"kubernetes.io/projected/84bba80c-841e-4df3-87e0-901afbc23bf3-kube-api-access-vcsdn\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.897887 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/84bba80c-841e-4df3-87e0-901afbc23bf3-manager-config\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.898878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/84bba80c-841e-4df3-87e0-901afbc23bf3-manager-config\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.903986 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-webhook-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.904027 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-apiservice-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.908321 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84bba80c-841e-4df3-87e0-901afbc23bf3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:03 crc kubenswrapper[4826]: I0319 19:09:03.915335 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcsdn\" (UniqueName: \"kubernetes.io/projected/84bba80c-841e-4df3-87e0-901afbc23bf3-kube-api-access-vcsdn\") pod \"loki-operator-controller-manager-d88f59dd5-fqs6s\" (UID: \"84bba80c-841e-4df3-87e0-901afbc23bf3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:04 crc kubenswrapper[4826]: I0319 19:09:04.010481 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:04 crc kubenswrapper[4826]: I0319 19:09:04.227918 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s"] Mar 19 19:09:04 crc kubenswrapper[4826]: W0319 19:09:04.247727 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bba80c_841e_4df3_87e0_901afbc23bf3.slice/crio-30061910d0ee4929083ea39342dabb4ff3dd63c94e4547c61f89329dbb50d4de WatchSource:0}: Error finding container 30061910d0ee4929083ea39342dabb4ff3dd63c94e4547c61f89329dbb50d4de: Status 404 returned error can't find the container with id 30061910d0ee4929083ea39342dabb4ff3dd63c94e4547c61f89329dbb50d4de Mar 19 19:09:04 crc kubenswrapper[4826]: I0319 19:09:04.936462 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" event={"ID":"84bba80c-841e-4df3-87e0-901afbc23bf3","Type":"ContainerStarted","Data":"30061910d0ee4929083ea39342dabb4ff3dd63c94e4547c61f89329dbb50d4de"} Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.583244 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-gpn69"] Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.584691 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.588329 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.589238 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.589492 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-2nnn7" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.614273 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-gpn69"] Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.748219 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6j2\" (UniqueName: \"kubernetes.io/projected/c2f22ac6-fb47-448e-8570-b95a2688d081-kube-api-access-9v6j2\") pod \"cluster-logging-operator-66689c4bbf-gpn69\" (UID: \"c2f22ac6-fb47-448e-8570-b95a2688d081\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.849177 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6j2\" (UniqueName: \"kubernetes.io/projected/c2f22ac6-fb47-448e-8570-b95a2688d081-kube-api-access-9v6j2\") pod \"cluster-logging-operator-66689c4bbf-gpn69\" (UID: \"c2f22ac6-fb47-448e-8570-b95a2688d081\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.867340 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6j2\" (UniqueName: \"kubernetes.io/projected/c2f22ac6-fb47-448e-8570-b95a2688d081-kube-api-access-9v6j2\") pod 
\"cluster-logging-operator-66689c4bbf-gpn69\" (UID: \"c2f22ac6-fb47-448e-8570-b95a2688d081\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" Mar 19 19:09:06 crc kubenswrapper[4826]: I0319 19:09:06.911024 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" Mar 19 19:09:07 crc kubenswrapper[4826]: I0319 19:09:07.247978 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-gpn69"] Mar 19 19:09:07 crc kubenswrapper[4826]: W0319 19:09:07.259610 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f22ac6_fb47_448e_8570_b95a2688d081.slice/crio-98d420e5da1a3bf6d2c407ffa3a40645cd737c3d62d254781d8efce60d77b153 WatchSource:0}: Error finding container 98d420e5da1a3bf6d2c407ffa3a40645cd737c3d62d254781d8efce60d77b153: Status 404 returned error can't find the container with id 98d420e5da1a3bf6d2c407ffa3a40645cd737c3d62d254781d8efce60d77b153 Mar 19 19:09:07 crc kubenswrapper[4826]: I0319 19:09:07.959678 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" event={"ID":"c2f22ac6-fb47-448e-8570-b95a2688d081","Type":"ContainerStarted","Data":"98d420e5da1a3bf6d2c407ffa3a40645cd737c3d62d254781d8efce60d77b153"} Mar 19 19:09:10 crc kubenswrapper[4826]: I0319 19:09:10.534196 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:10 crc kubenswrapper[4826]: I0319 19:09:10.591838 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:10 crc kubenswrapper[4826]: I0319 19:09:10.987131 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" event={"ID":"84bba80c-841e-4df3-87e0-901afbc23bf3","Type":"ContainerStarted","Data":"848443327f956e513fb70499b5d5c8874d8078b35219c51423462c8665815814"} Mar 19 19:09:13 crc kubenswrapper[4826]: I0319 19:09:13.754385 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:09:13 crc kubenswrapper[4826]: I0319 19:09:13.755991 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-295bd" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="registry-server" containerID="cri-o://1fe4b3d6867cf38019a25240af4b912e650b14fc9e8daeea59e52453d8485846" gracePeriod=2 Mar 19 19:09:14 crc kubenswrapper[4826]: I0319 19:09:14.036971 4826 generic.go:334] "Generic (PLEG): container finished" podID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerID="1fe4b3d6867cf38019a25240af4b912e650b14fc9e8daeea59e52453d8485846" exitCode=0 Mar 19 19:09:14 crc kubenswrapper[4826]: I0319 19:09:14.037015 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerDied","Data":"1fe4b3d6867cf38019a25240af4b912e650b14fc9e8daeea59e52453d8485846"} Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.047906 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.083133 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-295bd" event={"ID":"a8929441-a64d-4fd8-aa9a-12b6585090e5","Type":"ContainerDied","Data":"d48e3084534869b35086c7381a4563ded421096b7c3f9d0f0f6b25e9add3f8ab"} Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.084326 4826 scope.go:117] "RemoveContainer" containerID="1fe4b3d6867cf38019a25240af4b912e650b14fc9e8daeea59e52453d8485846" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.083403 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-295bd" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.117497 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkkgp\" (UniqueName: \"kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp\") pod \"a8929441-a64d-4fd8-aa9a-12b6585090e5\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.117623 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content\") pod \"a8929441-a64d-4fd8-aa9a-12b6585090e5\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.117785 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities\") pod \"a8929441-a64d-4fd8-aa9a-12b6585090e5\" (UID: \"a8929441-a64d-4fd8-aa9a-12b6585090e5\") " Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.118554 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities" (OuterVolumeSpecName: "utilities") pod "a8929441-a64d-4fd8-aa9a-12b6585090e5" (UID: "a8929441-a64d-4fd8-aa9a-12b6585090e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.125421 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp" (OuterVolumeSpecName: "kube-api-access-zkkgp") pod "a8929441-a64d-4fd8-aa9a-12b6585090e5" (UID: "a8929441-a64d-4fd8-aa9a-12b6585090e5"). InnerVolumeSpecName "kube-api-access-zkkgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.133055 4826 scope.go:117] "RemoveContainer" containerID="428c8464a031245eeaff4997467dc696368fe0618be370763fb84ca8994fa380" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.198118 4826 scope.go:117] "RemoveContainer" containerID="8e632c0b3cf98a82f2ccf0ccff6ee20cf96c64dc86ff1ebf498702c0e0e7efd3" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.219538 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.219565 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkkgp\" (UniqueName: \"kubernetes.io/projected/a8929441-a64d-4fd8-aa9a-12b6585090e5-kube-api-access-zkkgp\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.265957 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8929441-a64d-4fd8-aa9a-12b6585090e5" (UID: "a8929441-a64d-4fd8-aa9a-12b6585090e5"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.321374 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8929441-a64d-4fd8-aa9a-12b6585090e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.422493 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:09:18 crc kubenswrapper[4826]: I0319 19:09:18.433583 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-295bd"] Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.097257 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" event={"ID":"c2f22ac6-fb47-448e-8570-b95a2688d081","Type":"ContainerStarted","Data":"3926a65a1a288d5564ef92d14c9d96dc00b7a52f0ee9ed427bc16fb2f4ca3cf8"} Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.102049 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" event={"ID":"84bba80c-841e-4df3-87e0-901afbc23bf3","Type":"ContainerStarted","Data":"dc14c179a0b9e1cc78023ceefe65dc406c1c2b35101c57b999c61abd1b814954"} Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.103316 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.106646 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.131953 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-logging/cluster-logging-operator-66689c4bbf-gpn69" podStartSLOduration=2.341403292 podStartE2EDuration="13.131920577s" podCreationTimestamp="2026-03-19 19:09:06 +0000 UTC" firstStartedPulling="2026-03-19 19:09:07.262061482 +0000 UTC m=+772.016129795" lastFinishedPulling="2026-03-19 19:09:18.052578757 +0000 UTC m=+782.806647080" observedRunningTime="2026-03-19 19:09:19.120373694 +0000 UTC m=+783.874442057" watchObservedRunningTime="2026-03-19 19:09:19.131920577 +0000 UTC m=+783.885988930" Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.173009 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podStartSLOduration=2.286796141 podStartE2EDuration="16.172986982s" podCreationTimestamp="2026-03-19 19:09:03 +0000 UTC" firstStartedPulling="2026-03-19 19:09:04.253505318 +0000 UTC m=+769.007573631" lastFinishedPulling="2026-03-19 19:09:18.139696149 +0000 UTC m=+782.893764472" observedRunningTime="2026-03-19 19:09:19.15822214 +0000 UTC m=+783.912290463" watchObservedRunningTime="2026-03-19 19:09:19.172986982 +0000 UTC m=+783.927055315" Mar 19 19:09:19 crc kubenswrapper[4826]: I0319 19:09:19.985995 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" path="/var/lib/kubelet/pods/a8929441-a64d-4fd8-aa9a-12b6585090e5/volumes" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.290013 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 19 19:09:23 crc kubenswrapper[4826]: E0319 19:09:23.290778 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="registry-server" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.290790 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="registry-server" Mar 19 19:09:23 crc kubenswrapper[4826]: E0319 
19:09:23.290816 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="extract-content" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.290822 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="extract-content" Mar 19 19:09:23 crc kubenswrapper[4826]: E0319 19:09:23.290833 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="extract-utilities" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.290839 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="extract-utilities" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.290941 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8929441-a64d-4fd8-aa9a-12b6585090e5" containerName="registry-server" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.291345 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.293829 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.293857 4826 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-zhzv4" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.294063 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.309883 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.391306 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4kq9\" (UniqueName: \"kubernetes.io/projected/d3ec390a-1afc-4653-8adc-e3d664227faa-kube-api-access-m4kq9\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.391534 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.493271 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4kq9\" (UniqueName: \"kubernetes.io/projected/d3ec390a-1afc-4653-8adc-e3d664227faa-kube-api-access-m4kq9\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.493390 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.496507 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.496554 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/217402feb789d928c08361befe9a45eb1b9aa738ad3b983394cbb734db36c2d2/globalmount\"" pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.523391 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4kq9\" (UniqueName: \"kubernetes.io/projected/d3ec390a-1afc-4653-8adc-e3d664227faa-kube-api-access-m4kq9\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.553979 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0c3297a1-9746-477a-ac0c-9e30e681780b\") pod \"minio\" (UID: \"d3ec390a-1afc-4653-8adc-e3d664227faa\") " pod="minio-dev/minio" Mar 19 19:09:23 crc kubenswrapper[4826]: I0319 19:09:23.663829 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 19 19:09:24 crc kubenswrapper[4826]: I0319 19:09:24.099251 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 19 19:09:24 crc kubenswrapper[4826]: I0319 19:09:24.140968 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"d3ec390a-1afc-4653-8adc-e3d664227faa","Type":"ContainerStarted","Data":"98a314177c7a7ca24592525fd36bef2e5319625f39439488c7c354b88f0034a1"} Mar 19 19:09:25 crc kubenswrapper[4826]: I0319 19:09:25.401083 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:09:25 crc kubenswrapper[4826]: I0319 19:09:25.402853 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.171059 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.172479 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.182905 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.236545 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqdt6\" (UniqueName: \"kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.236590 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.236612 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.338063 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdt6\" (UniqueName: \"kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.338111 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.338130 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.338583 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.338604 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.360462 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdt6\" (UniqueName: \"kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6\") pod \"community-operators-l64bk\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:26 crc kubenswrapper[4826]: I0319 19:09:26.495095 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:27 crc kubenswrapper[4826]: I0319 19:09:27.912392 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:27 crc kubenswrapper[4826]: W0319 19:09:27.915915 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1119aa44_6ba2_48fd_a96f_64120fc6cb20.slice/crio-8fbe69f4a5f7f3ce3960f8cbfca2355d0314d740b3fd7decc537fd7340a8c36f WatchSource:0}: Error finding container 8fbe69f4a5f7f3ce3960f8cbfca2355d0314d740b3fd7decc537fd7340a8c36f: Status 404 returned error can't find the container with id 8fbe69f4a5f7f3ce3960f8cbfca2355d0314d740b3fd7decc537fd7340a8c36f Mar 19 19:09:28 crc kubenswrapper[4826]: I0319 19:09:28.172525 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"d3ec390a-1afc-4653-8adc-e3d664227faa","Type":"ContainerStarted","Data":"32fc6e2e3489b4c146c845db8922b9af4eb90cd1f34d485eb40826f2ac94740f"} Mar 19 19:09:28 crc kubenswrapper[4826]: I0319 19:09:28.174955 4826 generic.go:334] "Generic (PLEG): container finished" podID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerID="4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44" exitCode=0 Mar 19 19:09:28 crc kubenswrapper[4826]: I0319 19:09:28.174998 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerDied","Data":"4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44"} Mar 19 19:09:28 crc kubenswrapper[4826]: I0319 19:09:28.175021 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerStarted","Data":"8fbe69f4a5f7f3ce3960f8cbfca2355d0314d740b3fd7decc537fd7340a8c36f"} Mar 19 19:09:28 
crc kubenswrapper[4826]: I0319 19:09:28.193238 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.829493266 podStartE2EDuration="7.193181211s" podCreationTimestamp="2026-03-19 19:09:21 +0000 UTC" firstStartedPulling="2026-03-19 19:09:24.107922501 +0000 UTC m=+788.861990804" lastFinishedPulling="2026-03-19 19:09:27.471610426 +0000 UTC m=+792.225678749" observedRunningTime="2026-03-19 19:09:28.185317859 +0000 UTC m=+792.939386182" watchObservedRunningTime="2026-03-19 19:09:28.193181211 +0000 UTC m=+792.947249534" Mar 19 19:09:33 crc kubenswrapper[4826]: I0319 19:09:33.211572 4826 generic.go:334] "Generic (PLEG): container finished" podID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerID="13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250" exitCode=0 Mar 19 19:09:33 crc kubenswrapper[4826]: I0319 19:09:33.211948 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerDied","Data":"13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250"} Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.130550 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.131834 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.134093 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.134950 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.136585 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-nk8kj" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.146992 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.149856 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.162061 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.221765 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerStarted","Data":"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6"} Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.251220 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l64bk" podStartSLOduration=2.798017701 podStartE2EDuration="8.251202391s" podCreationTimestamp="2026-03-19 19:09:26 +0000 UTC" firstStartedPulling="2026-03-19 19:09:28.176615056 +0000 UTC m=+792.930683379" lastFinishedPulling="2026-03-19 19:09:33.629799736 +0000 UTC m=+798.383868069" observedRunningTime="2026-03-19 
19:09:34.249342295 +0000 UTC m=+799.003410628" watchObservedRunningTime="2026-03-19 19:09:34.251202391 +0000 UTC m=+799.005270714" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.281035 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.281089 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.281136 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.281152 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwz7\" (UniqueName: \"kubernetes.io/projected/e1f51b15-5d82-43d5-b391-5f4b10434957-kube-api-access-8wwz7\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc 
kubenswrapper[4826]: I0319 19:09:34.281179 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-config\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.286556 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.287334 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.292826 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.293027 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.293069 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.303859 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.382286 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-config\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.382372 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.383281 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.383336 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.383353 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwz7\" (UniqueName: \"kubernetes.io/projected/e1f51b15-5d82-43d5-b391-5f4b10434957-kube-api-access-8wwz7\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.382583 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.383581 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-config\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.384148 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.384323 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.389555 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.389568 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/e1f51b15-5d82-43d5-b391-5f4b10434957-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.389618 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 19 19:09:34 crc 
kubenswrapper[4826]: I0319 19:09:34.389741 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.409260 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.430750 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwz7\" (UniqueName: \"kubernetes.io/projected/e1f51b15-5d82-43d5-b391-5f4b10434957-kube-api-access-8wwz7\") pod \"logging-loki-distributor-9c6b6d984-qrlfg\" (UID: \"e1f51b15-5d82-43d5-b391-5f4b10434957\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.462003 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484528 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-config\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484806 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484842 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484859 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bkb\" (UniqueName: \"kubernetes.io/projected/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-kube-api-access-b7bkb\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484893 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484913 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484927 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: 
\"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484952 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-config\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.484974 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.485004 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7d8t\" (UniqueName: \"kubernetes.io/projected/0fc08676-ae6f-4018-8f85-259585de45fe-kube-api-access-s7d8t\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.485024 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.511894 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.514346 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.519589 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.519773 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.519913 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.520516 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.520637 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.524962 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.540014 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.541307 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.548962 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-qj4w2" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.559323 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk"] Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597629 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597684 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bkb\" (UniqueName: \"kubernetes.io/projected/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-kube-api-access-b7bkb\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597759 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597788 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597812 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597850 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-config\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597888 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597943 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7d8t\" (UniqueName: \"kubernetes.io/projected/0fc08676-ae6f-4018-8f85-259585de45fe-kube-api-access-s7d8t\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 
19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.597980 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.598020 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-config\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.598051 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.600505 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.601105 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-config\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: 
\"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.603265 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.608041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.613875 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.614758 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/0fc08676-ae6f-4018-8f85-259585de45fe-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.615162 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b7bkb\" (UniqueName: \"kubernetes.io/projected/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-kube-api-access-b7bkb\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.615947 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.616508 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7d8t\" (UniqueName: \"kubernetes.io/projected/0fc08676-ae6f-4018-8f85-259585de45fe-kube-api-access-s7d8t\") pod \"logging-loki-query-frontend-ff66c4dc9-l2p46\" (UID: \"0fc08676-ae6f-4018-8f85-259585de45fe\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.617907 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-config\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.625187 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/081e84d7-1c7e-4c6f-935e-ee01eaf393e2-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qltmk\" (UID: \"081e84d7-1c7e-4c6f-935e-ee01eaf393e2\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc 
kubenswrapper[4826]: I0319 19:09:34.699549 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699592 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699617 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699638 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699676 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699695 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699710 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699728 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699744 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " 
pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699785 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699801 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699814 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzsmj\" (UniqueName: \"kubernetes.io/projected/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-kube-api-access-mzsmj\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699853 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699870 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rkw\" (UniqueName: \"kubernetes.io/projected/1e1484c9-801f-4999-9754-456df604d7ca-kube-api-access-44rkw\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.699896 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.769104 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804488 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804565 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804607 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804643 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804688 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804713 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804737 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804758 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804786 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 
19:09:34.804807 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804861 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804887 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzsmj\" (UniqueName: \"kubernetes.io/projected/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-kube-api-access-mzsmj\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804905 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804932 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " 
pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804953 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.804975 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rkw\" (UniqueName: \"kubernetes.io/projected/1e1484c9-801f-4999-9754-456df604d7ca-kube-api-access-44rkw\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.813563 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.817625 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.818277 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.819226 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-rbac\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.820395 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.821077 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-lokistack-gateway\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.822078 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.827921 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.850766 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.851380 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.852464 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.854675 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tenants\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 
19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.865888 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1e1484c9-801f-4999-9754-456df604d7ca-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.866374 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-tls-secret\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.867160 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rkw\" (UniqueName: \"kubernetes.io/projected/1e1484c9-801f-4999-9754-456df604d7ca-kube-api-access-44rkw\") pod \"logging-loki-gateway-68b4bcd8f5-mhqzk\" (UID: \"1e1484c9-801f-4999-9754-456df604d7ca\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.877091 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzsmj\" (UniqueName: \"kubernetes.io/projected/64ca34d8-5f9f-448d-9ab2-414c5b4757e9-kube-api-access-mzsmj\") pod \"logging-loki-gateway-68b4bcd8f5-zvtrc\" (UID: \"64ca34d8-5f9f-448d-9ab2-414c5b4757e9\") " pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.906488 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:34 crc kubenswrapper[4826]: I0319 19:09:34.922105 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg"] Mar 19 19:09:34 crc kubenswrapper[4826]: W0319 19:09:34.930835 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f51b15_5d82_43d5_b391_5f4b10434957.slice/crio-5118282c01d1b2235582f7f3c6491286aa6ddc64770830621945cc1eb831ff2d WatchSource:0}: Error finding container 5118282c01d1b2235582f7f3c6491286aa6ddc64770830621945cc1eb831ff2d: Status 404 returned error can't find the container with id 5118282c01d1b2235582f7f3c6491286aa6ddc64770830621945cc1eb831ff2d Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.140599 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.160033 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.231580 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" event={"ID":"e1f51b15-5d82-43d5-b391-5f4b10434957","Type":"ContainerStarted","Data":"5118282c01d1b2235582f7f3c6491286aa6ddc64770830621945cc1eb831ff2d"} Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.285766 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.286611 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.290713 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.290994 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.304101 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.339373 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.368187 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.379284 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.382505 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.382501 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.388913 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.395763 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413119 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-config\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413160 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413188 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413465 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413592 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413779 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8j8z\" (UniqueName: \"kubernetes.io/projected/93990ea7-96ba-4c12-b92c-17a7c38aece4-kube-api-access-p8j8z\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413799 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.413886 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.414644 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk"] Mar 19 19:09:35 crc kubenswrapper[4826]: W0319 19:09:35.419802 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod081e84d7_1c7e_4c6f_935e_ee01eaf393e2.slice/crio-6c39d7c5217793353e377feaba70c2ec5ceacd0df0791f8a83bdf3d8931db5ed WatchSource:0}: Error finding container 6c39d7c5217793353e377feaba70c2ec5ceacd0df0791f8a83bdf3d8931db5ed: Status 404 returned error can't find the container with id 6c39d7c5217793353e377feaba70c2ec5ceacd0df0791f8a83bdf3d8931db5ed Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.441993 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc"] Mar 19 19:09:35 crc kubenswrapper[4826]: W0319 19:09:35.449763 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ca34d8_5f9f_448d_9ab2_414c5b4757e9.slice/crio-18cc8d7f3d468e16140cff5ea4bdf4cfed87b6da26552027143e03476f5efc74 WatchSource:0}: Error finding container 18cc8d7f3d468e16140cff5ea4bdf4cfed87b6da26552027143e03476f5efc74: Status 404 returned error can't find the container with id 18cc8d7f3d468e16140cff5ea4bdf4cfed87b6da26552027143e03476f5efc74 Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.469953 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.471309 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.474369 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.475323 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.476492 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515785 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8j8z\" (UniqueName: \"kubernetes.io/projected/93990ea7-96ba-4c12-b92c-17a7c38aece4-kube-api-access-p8j8z\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515825 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515849 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515884 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515904 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5s6c\" (UniqueName: \"kubernetes.io/projected/377fff75-1f59-4c28-a3ed-2bd89e803b73-kube-api-access-m5s6c\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515934 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-config\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515963 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.515980 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517185 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517228 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517247 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-config\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517270 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517330 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517347 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517245 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.517417 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93990ea7-96ba-4c12-b92c-17a7c38aece4-config\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.520990 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.521034 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/072eb70d60d9682053021b63995c7a38d5b47b5671300442e7fd2fe1cf1109d1/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.522287 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.531282 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.531292 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/93990ea7-96ba-4c12-b92c-17a7c38aece4-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.531430 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.531529 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f53133b3678dfd670965ad0da9dc48121739a2eff9f89c062c7875e4c886ba3/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.534114 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8j8z\" (UniqueName: \"kubernetes.io/projected/93990ea7-96ba-4c12-b92c-17a7c38aece4-kube-api-access-p8j8z\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.544034 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7153bbad-f47f-49bb-ab47-307f281f8b4f\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.555772 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d293036-e511-4a5a-bdb2-73b6918edb94\") pod \"logging-loki-ingester-0\" (UID: \"93990ea7-96ba-4c12-b92c-17a7c38aece4\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.605991 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.618949 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619245 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619327 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619426 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619510 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619625 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619828 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5s6c\" (UniqueName: \"kubernetes.io/projected/377fff75-1f59-4c28-a3ed-2bd89e803b73-kube-api-access-m5s6c\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619918 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.619995 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-config\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.620080 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.620148 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.620215 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.620282 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t89bp\" (UniqueName: \"kubernetes.io/projected/05505420-3d58-4de7-9da6-2f27e54c32f5-kube-api-access-t89bp\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.620352 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-config\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc 
kubenswrapper[4826]: I0319 19:09:35.620843 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.621308 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377fff75-1f59-4c28-a3ed-2bd89e803b73-config\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.622708 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.622869 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.622919 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/00288785598a170f4e9757f1997f1dbaa44ffacd92a4013c8c168e65ec5923e1/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.623144 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.624027 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/377fff75-1f59-4c28-a3ed-2bd89e803b73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.638287 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5s6c\" (UniqueName: \"kubernetes.io/projected/377fff75-1f59-4c28-a3ed-2bd89e803b73-kube-api-access-m5s6c\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.652256 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cfab44ae-32b3-4a9c-b8f2-d2acfbe1289f\") pod \"logging-loki-compactor-0\" (UID: \"377fff75-1f59-4c28-a3ed-2bd89e803b73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.716636 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.721907 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722247 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722291 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722326 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t89bp\" (UniqueName: \"kubernetes.io/projected/05505420-3d58-4de7-9da6-2f27e54c32f5-kube-api-access-t89bp\") pod \"logging-loki-index-gateway-0\" (UID: 
\"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722361 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-config\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722450 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722533 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.722633 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.724817 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05505420-3d58-4de7-9da6-2f27e54c32f5-config\") pod 
\"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.729228 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.729714 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.730165 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/05505420-3d58-4de7-9da6-2f27e54c32f5-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.738096 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.738133 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/73ac87f3da4ca82c4cddfe471ac1724c66b0b7188672f7120f512737248986f8/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.742784 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t89bp\" (UniqueName: \"kubernetes.io/projected/05505420-3d58-4de7-9da6-2f27e54c32f5-kube-api-access-t89bp\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:35 crc kubenswrapper[4826]: I0319 19:09:35.811011 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75aba771-01e7-44b4-b64c-d87ff5be36b2\") pod \"logging-loki-index-gateway-0\" (UID: \"05505420-3d58-4de7-9da6-2f27e54c32f5\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.089064 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.120125 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 19:09:36 crc kubenswrapper[4826]: W0319 19:09:36.135867 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93990ea7_96ba_4c12_b92c_17a7c38aece4.slice/crio-fe5a93f1ed4d6a052c0b802681205267e32e2d7ecd6e0587bdfd4e195330a75c WatchSource:0}: Error finding container fe5a93f1ed4d6a052c0b802681205267e32e2d7ecd6e0587bdfd4e195330a75c: Status 404 returned error can't find the container with id fe5a93f1ed4d6a052c0b802681205267e32e2d7ecd6e0587bdfd4e195330a75c Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.193900 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 19:09:36 crc kubenswrapper[4826]: W0319 19:09:36.199212 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377fff75_1f59_4c28_a3ed_2bd89e803b73.slice/crio-a5df96fb0ba5b62268a5d4f36ae1aae889a7ed9ce24738e3d5faedc590854cbb WatchSource:0}: Error finding container a5df96fb0ba5b62268a5d4f36ae1aae889a7ed9ce24738e3d5faedc590854cbb: Status 404 returned error can't find the container with id a5df96fb0ba5b62268a5d4f36ae1aae889a7ed9ce24738e3d5faedc590854cbb Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.242584 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"377fff75-1f59-4c28-a3ed-2bd89e803b73","Type":"ContainerStarted","Data":"a5df96fb0ba5b62268a5d4f36ae1aae889a7ed9ce24738e3d5faedc590854cbb"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.243319 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" 
event={"ID":"64ca34d8-5f9f-448d-9ab2-414c5b4757e9","Type":"ContainerStarted","Data":"18cc8d7f3d468e16140cff5ea4bdf4cfed87b6da26552027143e03476f5efc74"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.243954 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" event={"ID":"0fc08676-ae6f-4018-8f85-259585de45fe","Type":"ContainerStarted","Data":"c0d4640fa80addcf31559b53adc32485de9f8e3d27b9271ab3c7353ca94e77d3"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.245183 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" event={"ID":"081e84d7-1c7e-4c6f-935e-ee01eaf393e2","Type":"ContainerStarted","Data":"6c39d7c5217793353e377feaba70c2ec5ceacd0df0791f8a83bdf3d8931db5ed"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.246409 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"93990ea7-96ba-4c12-b92c-17a7c38aece4","Type":"ContainerStarted","Data":"fe5a93f1ed4d6a052c0b802681205267e32e2d7ecd6e0587bdfd4e195330a75c"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.247575 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" event={"ID":"1e1484c9-801f-4999-9754-456df604d7ca","Type":"ContainerStarted","Data":"278cfaf261c02be43da200d0e62ecf00641c4fb52f712aa0d0e0e194ef854510"} Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.495303 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.496087 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:36 crc kubenswrapper[4826]: W0319 19:09:36.533028 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05505420_3d58_4de7_9da6_2f27e54c32f5.slice/crio-8668a236c13ec4a6fb1fcc7ac5621bada56771dcc947b1c69ce49307677b4020 WatchSource:0}: Error finding container 8668a236c13ec4a6fb1fcc7ac5621bada56771dcc947b1c69ce49307677b4020: Status 404 returned error can't find the container with id 8668a236c13ec4a6fb1fcc7ac5621bada56771dcc947b1c69ce49307677b4020 Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.536114 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 19:09:36 crc kubenswrapper[4826]: I0319 19:09:36.586998 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:37 crc kubenswrapper[4826]: I0319 19:09:37.273227 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"05505420-3d58-4de7-9da6-2f27e54c32f5","Type":"ContainerStarted","Data":"8668a236c13ec4a6fb1fcc7ac5621bada56771dcc947b1c69ce49307677b4020"} Mar 19 19:09:38 crc kubenswrapper[4826]: I0319 19:09:38.350342 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:39 crc kubenswrapper[4826]: I0319 19:09:39.569730 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.304043 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" event={"ID":"1e1484c9-801f-4999-9754-456df604d7ca","Type":"ContainerStarted","Data":"9fe3f7a88bf01f02e7a5b7ee03452ef09f42fa6a0f66337f21a9f7935c30dec7"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.306616 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" 
event={"ID":"e1f51b15-5d82-43d5-b391-5f4b10434957","Type":"ContainerStarted","Data":"843ded38fc152e6a3e2c530f593ab962f00506d6bb2138d4001cd5e01b59ba80"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.306786 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.309156 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"377fff75-1f59-4c28-a3ed-2bd89e803b73","Type":"ContainerStarted","Data":"8b7eb6a1b1281acaa0cc64d9d75db6546ccb9f83d63e8741137ec6d1b955f396"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.309237 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.312301 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" event={"ID":"64ca34d8-5f9f-448d-9ab2-414c5b4757e9","Type":"ContainerStarted","Data":"e82c8c8a549eec456f56cc91bca45c911e7eeca98b9104cdcf25f201e1d257cb"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.314340 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" event={"ID":"0fc08676-ae6f-4018-8f85-259585de45fe","Type":"ContainerStarted","Data":"e62f4303bede2e4b203dce45bf07f34ad9a409a3b74bb2f4a7bf1b188cbc1a13"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.315186 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.317771 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" 
event={"ID":"081e84d7-1c7e-4c6f-935e-ee01eaf393e2","Type":"ContainerStarted","Data":"4ba46dba152e1c786c0bcbaad135f8b78f273ed1452b990ddd787afbe3fbea46"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.317954 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.320479 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"05505420-3d58-4de7-9da6-2f27e54c32f5","Type":"ContainerStarted","Data":"07965e6ecc9f64c4ab7be6a4ecfcba0ef0bad4b4db9890bfb8386f1e86d09a38"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.320699 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.323194 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"93990ea7-96ba-4c12-b92c-17a7c38aece4","Type":"ContainerStarted","Data":"cdde00418c634d250026ec3f3cc6d7f08a4248bfe0fc290254d91e500a5e6dfb"} Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.323417 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l64bk" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="registry-server" containerID="cri-o://44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6" gracePeriod=2 Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.338024 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podStartSLOduration=1.963231431 podStartE2EDuration="6.337994864s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:34.934151971 +0000 UTC m=+799.688220284" lastFinishedPulling="2026-03-19 19:09:39.308915404 +0000 
UTC m=+804.062983717" observedRunningTime="2026-03-19 19:09:40.331638528 +0000 UTC m=+805.085706891" watchObservedRunningTime="2026-03-19 19:09:40.337994864 +0000 UTC m=+805.092063197" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.360381 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.186241266 podStartE2EDuration="6.360349581s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:36.214456138 +0000 UTC m=+800.968524451" lastFinishedPulling="2026-03-19 19:09:39.388564433 +0000 UTC m=+804.142632766" observedRunningTime="2026-03-19 19:09:40.353944244 +0000 UTC m=+805.108012607" watchObservedRunningTime="2026-03-19 19:09:40.360349581 +0000 UTC m=+805.114417914" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.385791 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.637340254 podStartE2EDuration="6.385763943s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:36.537423421 +0000 UTC m=+801.291491744" lastFinishedPulling="2026-03-19 19:09:39.28584712 +0000 UTC m=+804.039915433" observedRunningTime="2026-03-19 19:09:40.383874027 +0000 UTC m=+805.137942380" watchObservedRunningTime="2026-03-19 19:09:40.385763943 +0000 UTC m=+805.139832286" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.436251 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.207614768 podStartE2EDuration="6.436229487s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:36.137534776 +0000 UTC m=+800.891603089" lastFinishedPulling="2026-03-19 19:09:39.366149495 +0000 UTC m=+804.120217808" observedRunningTime="2026-03-19 19:09:40.412783843 +0000 UTC m=+805.166852176" 
watchObservedRunningTime="2026-03-19 19:09:40.436229487 +0000 UTC m=+805.190297820" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.439287 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podStartSLOduration=2.511792183 podStartE2EDuration="6.439276492s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:35.38040734 +0000 UTC m=+800.134475663" lastFinishedPulling="2026-03-19 19:09:39.307891659 +0000 UTC m=+804.061959972" observedRunningTime="2026-03-19 19:09:40.432137697 +0000 UTC m=+805.186206020" watchObservedRunningTime="2026-03-19 19:09:40.439276492 +0000 UTC m=+805.193344815" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.876303 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.894965 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podStartSLOduration=3.081227477 podStartE2EDuration="6.894942752s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:35.412193798 +0000 UTC m=+800.166262101" lastFinishedPulling="2026-03-19 19:09:39.225909063 +0000 UTC m=+803.979977376" observedRunningTime="2026-03-19 19:09:40.460244615 +0000 UTC m=+805.214312938" watchObservedRunningTime="2026-03-19 19:09:40.894942752 +0000 UTC m=+805.649011065" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.928767 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content\") pod \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.928948 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities\") pod \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.929020 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqdt6\" (UniqueName: \"kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6\") pod \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\" (UID: \"1119aa44-6ba2-48fd-a96f-64120fc6cb20\") " Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.929637 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities" (OuterVolumeSpecName: "utilities") pod "1119aa44-6ba2-48fd-a96f-64120fc6cb20" (UID: "1119aa44-6ba2-48fd-a96f-64120fc6cb20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.933494 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6" (OuterVolumeSpecName: "kube-api-access-lqdt6") pod "1119aa44-6ba2-48fd-a96f-64120fc6cb20" (UID: "1119aa44-6ba2-48fd-a96f-64120fc6cb20"). InnerVolumeSpecName "kube-api-access-lqdt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:09:40 crc kubenswrapper[4826]: I0319 19:09:40.983673 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1119aa44-6ba2-48fd-a96f-64120fc6cb20" (UID: "1119aa44-6ba2-48fd-a96f-64120fc6cb20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.030642 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.030723 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqdt6\" (UniqueName: \"kubernetes.io/projected/1119aa44-6ba2-48fd-a96f-64120fc6cb20-kube-api-access-lqdt6\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.030735 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1119aa44-6ba2-48fd-a96f-64120fc6cb20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.336730 4826 generic.go:334] "Generic (PLEG): container finished" podID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerID="44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6" exitCode=0 Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.336809 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerDied","Data":"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6"} Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.337131 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l64bk" event={"ID":"1119aa44-6ba2-48fd-a96f-64120fc6cb20","Type":"ContainerDied","Data":"8fbe69f4a5f7f3ce3960f8cbfca2355d0314d740b3fd7decc537fd7340a8c36f"} Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.336912 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l64bk" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.337174 4826 scope.go:117] "RemoveContainer" containerID="44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.338228 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.369817 4826 scope.go:117] "RemoveContainer" containerID="13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.386536 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.403096 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l64bk"] Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.409710 4826 scope.go:117] "RemoveContainer" containerID="4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.451204 4826 scope.go:117] "RemoveContainer" containerID="44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6" Mar 19 19:09:41 crc kubenswrapper[4826]: E0319 19:09:41.451906 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6\": container with ID starting with 44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6 not found: ID does not exist" containerID="44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.451949 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6"} err="failed to get container status \"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6\": rpc error: code = NotFound desc = could not find container \"44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6\": container with ID starting with 44f2587f750366c5ff0104d0dad3a31554d5862e3b655c1bd243752b313e47f6 not found: ID does not exist" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.451983 4826 scope.go:117] "RemoveContainer" containerID="13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250" Mar 19 19:09:41 crc kubenswrapper[4826]: E0319 19:09:41.452581 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250\": container with ID starting with 13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250 not found: ID does not exist" containerID="13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.452614 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250"} err="failed to get container status \"13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250\": rpc error: code = NotFound desc = could not find container \"13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250\": container with ID starting with 13eeb051d44e3b6abb3b6cbd8125342215442f32682cbac63afb81250ca79250 not found: ID does not exist" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.452640 4826 scope.go:117] "RemoveContainer" containerID="4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44" Mar 19 19:09:41 crc kubenswrapper[4826]: E0319 19:09:41.453203 4826 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44\": container with ID starting with 4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44 not found: ID does not exist" containerID="4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.453240 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44"} err="failed to get container status \"4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44\": rpc error: code = NotFound desc = could not find container \"4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44\": container with ID starting with 4048af147e881fa8554dd7cb5007bc3cab018fc173227fd51a761b468f346b44 not found: ID does not exist" Mar 19 19:09:41 crc kubenswrapper[4826]: I0319 19:09:41.989131 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" path="/var/lib/kubelet/pods/1119aa44-6ba2-48fd-a96f-64120fc6cb20/volumes" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.356341 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" event={"ID":"64ca34d8-5f9f-448d-9ab2-414c5b4757e9","Type":"ContainerStarted","Data":"41a63ad3761d3a8c1bafa832c61e51d10288c01940a0edc08f2395cf3b3f57f0"} Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.356917 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.356960 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.359925 4826 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" event={"ID":"1e1484c9-801f-4999-9754-456df604d7ca","Type":"ContainerStarted","Data":"417191565a32e7d6cf85716e26706bca83e3e43e7cf85b8ab3eab18d1c2bd364"} Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.360403 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.360643 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.378169 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.381282 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.384646 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.396328 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.398589 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podStartSLOduration=2.114217865 podStartE2EDuration="9.398555771s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:35.452184656 +0000 UTC m=+800.206252969" lastFinishedPulling="2026-03-19 19:09:42.736522562 +0000 UTC m=+807.490590875" observedRunningTime="2026-03-19 19:09:43.386776012 +0000 UTC m=+808.140844355" 
watchObservedRunningTime="2026-03-19 19:09:43.398555771 +0000 UTC m=+808.152624174" Mar 19 19:09:43 crc kubenswrapper[4826]: I0319 19:09:43.448217 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podStartSLOduration=2.137017063 podStartE2EDuration="9.448193735s" podCreationTimestamp="2026-03-19 19:09:34 +0000 UTC" firstStartedPulling="2026-03-19 19:09:35.42043073 +0000 UTC m=+800.174499033" lastFinishedPulling="2026-03-19 19:09:42.731607392 +0000 UTC m=+807.485675705" observedRunningTime="2026-03-19 19:09:43.444259789 +0000 UTC m=+808.198328142" watchObservedRunningTime="2026-03-19 19:09:43.448193735 +0000 UTC m=+808.202262078" Mar 19 19:09:54 crc kubenswrapper[4826]: I0319 19:09:54.472012 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 19:09:54 crc kubenswrapper[4826]: I0319 19:09:54.778604 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 19:09:54 crc kubenswrapper[4826]: I0319 19:09:54.914498 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.400929 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.401015 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.401077 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.401827 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.401932 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c" gracePeriod=600 Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.614672 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.615122 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:09:55 crc kubenswrapper[4826]: I0319 19:09:55.726794 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 19 19:09:56 crc kubenswrapper[4826]: I0319 
19:09:56.095867 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 19:09:56 crc kubenswrapper[4826]: I0319 19:09:56.482175 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c" exitCode=0 Mar 19 19:09:56 crc kubenswrapper[4826]: I0319 19:09:56.482239 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c"} Mar 19 19:09:56 crc kubenswrapper[4826]: I0319 19:09:56.482274 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc"} Mar 19 19:09:56 crc kubenswrapper[4826]: I0319 19:09:56.482298 4826 scope.go:117] "RemoveContainer" containerID="4ebb3021af767d982a8fa18f3ce572a8f3489148bb38a57e868427071dfe9b1a" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.143789 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565790-twk5t"] Mar 19 19:10:00 crc kubenswrapper[4826]: E0319 19:10:00.145015 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="extract-utilities" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.145038 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="extract-utilities" Mar 19 19:10:00 crc kubenswrapper[4826]: E0319 19:10:00.145061 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" 
containerName="registry-server" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.145071 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="registry-server" Mar 19 19:10:00 crc kubenswrapper[4826]: E0319 19:10:00.145086 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="extract-content" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.145096 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="extract-content" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.145318 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1119aa44-6ba2-48fd-a96f-64120fc6cb20" containerName="registry-server" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.146077 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.149970 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.150409 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.150488 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.165915 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-twk5t"] Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.281071 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbtw\" (UniqueName: 
\"kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw\") pod \"auto-csr-approver-29565790-twk5t\" (UID: \"6ac6f48b-794a-4278-8d3f-9a1cfd37c033\") " pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.383494 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbtw\" (UniqueName: \"kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw\") pod \"auto-csr-approver-29565790-twk5t\" (UID: \"6ac6f48b-794a-4278-8d3f-9a1cfd37c033\") " pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.418620 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbtw\" (UniqueName: \"kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw\") pod \"auto-csr-approver-29565790-twk5t\" (UID: \"6ac6f48b-794a-4278-8d3f-9a1cfd37c033\") " pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.467091 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:00 crc kubenswrapper[4826]: I0319 19:10:00.733177 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-twk5t"] Mar 19 19:10:00 crc kubenswrapper[4826]: W0319 19:10:00.739801 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac6f48b_794a_4278_8d3f_9a1cfd37c033.slice/crio-4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be WatchSource:0}: Error finding container 4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be: Status 404 returned error can't find the container with id 4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be Mar 19 19:10:01 crc kubenswrapper[4826]: I0319 19:10:01.546725 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-twk5t" event={"ID":"6ac6f48b-794a-4278-8d3f-9a1cfd37c033","Type":"ContainerStarted","Data":"4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be"} Mar 19 19:10:03 crc kubenswrapper[4826]: I0319 19:10:03.565167 4826 generic.go:334] "Generic (PLEG): container finished" podID="6ac6f48b-794a-4278-8d3f-9a1cfd37c033" containerID="2a327b90a2b9883180d1d7cf93684b3f22f1a73a49c53c9ce2b62ae5b1bc9116" exitCode=0 Mar 19 19:10:03 crc kubenswrapper[4826]: I0319 19:10:03.565253 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-twk5t" event={"ID":"6ac6f48b-794a-4278-8d3f-9a1cfd37c033","Type":"ContainerDied","Data":"2a327b90a2b9883180d1d7cf93684b3f22f1a73a49c53c9ce2b62ae5b1bc9116"} Mar 19 19:10:04 crc kubenswrapper[4826]: I0319 19:10:04.846450 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:04 crc kubenswrapper[4826]: I0319 19:10:04.960572 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsbtw\" (UniqueName: \"kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw\") pod \"6ac6f48b-794a-4278-8d3f-9a1cfd37c033\" (UID: \"6ac6f48b-794a-4278-8d3f-9a1cfd37c033\") " Mar 19 19:10:04 crc kubenswrapper[4826]: I0319 19:10:04.967056 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw" (OuterVolumeSpecName: "kube-api-access-fsbtw") pod "6ac6f48b-794a-4278-8d3f-9a1cfd37c033" (UID: "6ac6f48b-794a-4278-8d3f-9a1cfd37c033"). InnerVolumeSpecName "kube-api-access-fsbtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.064329 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsbtw\" (UniqueName: \"kubernetes.io/projected/6ac6f48b-794a-4278-8d3f-9a1cfd37c033-kube-api-access-fsbtw\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.583685 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565790-twk5t" event={"ID":"6ac6f48b-794a-4278-8d3f-9a1cfd37c033","Type":"ContainerDied","Data":"4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be"} Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.583773 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e93b9ab33d2eed2465b33f8a5d3bd120bbb765bc23ba676172a4bfc1144b2be" Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.583773 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565790-twk5t" Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.613324 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.613374 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.935020 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-gqmkb"] Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.942743 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565784-gqmkb"] Mar 19 19:10:05 crc kubenswrapper[4826]: I0319 19:10:05.987869 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12e6736-71d9-48ae-8df3-c385a2dd259f" path="/var/lib/kubelet/pods/c12e6736-71d9-48ae-8df3-c385a2dd259f/volumes" Mar 19 19:10:15 crc kubenswrapper[4826]: I0319 19:10:15.612097 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 19 19:10:15 crc kubenswrapper[4826]: I0319 19:10:15.612735 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:10:16 crc kubenswrapper[4826]: 
I0319 19:10:16.816918 4826 scope.go:117] "RemoveContainer" containerID="d19c308138bf340eca1e68c570b2372a1f1e2e310f8150adea90b049a74ec80d" Mar 19 19:10:25 crc kubenswrapper[4826]: I0319 19:10:25.612917 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 19 19:10:25 crc kubenswrapper[4826]: I0319 19:10:25.613478 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.604589 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:35 crc kubenswrapper[4826]: E0319 19:10:35.605577 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac6f48b-794a-4278-8d3f-9a1cfd37c033" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.605602 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac6f48b-794a-4278-8d3f-9a1cfd37c033" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.605902 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac6f48b-794a-4278-8d3f-9a1cfd37c033" containerName="oc" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.607630 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.614341 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.620610 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.628639 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.628733 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.628896 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ftrp\" (UniqueName: \"kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.729841 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ftrp\" (UniqueName: \"kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp\") pod \"certified-operators-klj9b\" (UID: 
\"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.729964 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.730001 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.731291 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.731513 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.772586 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ftrp\" (UniqueName: \"kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp\") pod \"certified-operators-klj9b\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " 
pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:35 crc kubenswrapper[4826]: I0319 19:10:35.948752 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:36 crc kubenswrapper[4826]: I0319 19:10:36.228556 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:36 crc kubenswrapper[4826]: I0319 19:10:36.866895 4826 generic.go:334] "Generic (PLEG): container finished" podID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerID="13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9" exitCode=0 Mar 19 19:10:36 crc kubenswrapper[4826]: I0319 19:10:36.866972 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerDied","Data":"13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9"} Mar 19 19:10:36 crc kubenswrapper[4826]: I0319 19:10:36.867157 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerStarted","Data":"0a5ff86107341ffb19c10d475345714ed824cf0322121dbc975b40174ed20c07"} Mar 19 19:10:37 crc kubenswrapper[4826]: I0319 19:10:37.879696 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerStarted","Data":"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41"} Mar 19 19:10:38 crc kubenswrapper[4826]: I0319 19:10:38.892295 4826 generic.go:334] "Generic (PLEG): container finished" podID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerID="b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41" exitCode=0 Mar 19 19:10:38 crc kubenswrapper[4826]: I0319 19:10:38.892402 4826 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerDied","Data":"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41"} Mar 19 19:10:39 crc kubenswrapper[4826]: I0319 19:10:39.904455 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerStarted","Data":"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133"} Mar 19 19:10:39 crc kubenswrapper[4826]: I0319 19:10:39.931566 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-klj9b" podStartSLOduration=2.506711739 podStartE2EDuration="4.931547315s" podCreationTimestamp="2026-03-19 19:10:35 +0000 UTC" firstStartedPulling="2026-03-19 19:10:36.869187591 +0000 UTC m=+861.623255914" lastFinishedPulling="2026-03-19 19:10:39.294023157 +0000 UTC m=+864.048091490" observedRunningTime="2026-03-19 19:10:39.925995891 +0000 UTC m=+864.680064234" watchObservedRunningTime="2026-03-19 19:10:39.931547315 +0000 UTC m=+864.685615658" Mar 19 19:10:45 crc kubenswrapper[4826]: I0319 19:10:45.949138 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:45 crc kubenswrapper[4826]: I0319 19:10:45.949591 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:46 crc kubenswrapper[4826]: I0319 19:10:46.001580 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:46 crc kubenswrapper[4826]: I0319 19:10:46.069776 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:46 crc kubenswrapper[4826]: I0319 
19:10:46.247797 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:47 crc kubenswrapper[4826]: I0319 19:10:47.972266 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-klj9b" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="registry-server" containerID="cri-o://84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133" gracePeriod=2 Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.357442 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.443688 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities\") pod \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.443752 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ftrp\" (UniqueName: \"kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp\") pod \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.443824 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content\") pod \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\" (UID: \"2577c1a0-e579-4d7c-a2e6-2494e67afbc1\") " Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.444602 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities" (OuterVolumeSpecName: 
"utilities") pod "2577c1a0-e579-4d7c-a2e6-2494e67afbc1" (UID: "2577c1a0-e579-4d7c-a2e6-2494e67afbc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.449584 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp" (OuterVolumeSpecName: "kube-api-access-9ftrp") pod "2577c1a0-e579-4d7c-a2e6-2494e67afbc1" (UID: "2577c1a0-e579-4d7c-a2e6-2494e67afbc1"). InnerVolumeSpecName "kube-api-access-9ftrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.546022 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.546284 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ftrp\" (UniqueName: \"kubernetes.io/projected/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-kube-api-access-9ftrp\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.980518 4826 generic.go:334] "Generic (PLEG): container finished" podID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerID="84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133" exitCode=0 Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.980558 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-klj9b" Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.980564 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerDied","Data":"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133"} Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.980589 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klj9b" event={"ID":"2577c1a0-e579-4d7c-a2e6-2494e67afbc1","Type":"ContainerDied","Data":"0a5ff86107341ffb19c10d475345714ed824cf0322121dbc975b40174ed20c07"} Mar 19 19:10:48 crc kubenswrapper[4826]: I0319 19:10:48.980634 4826 scope.go:117] "RemoveContainer" containerID="84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.006248 4826 scope.go:117] "RemoveContainer" containerID="b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.025939 4826 scope.go:117] "RemoveContainer" containerID="13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.096138 4826 scope.go:117] "RemoveContainer" containerID="84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133" Mar 19 19:10:49 crc kubenswrapper[4826]: E0319 19:10:49.097029 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133\": container with ID starting with 84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133 not found: ID does not exist" containerID="84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.097093 4826 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133"} err="failed to get container status \"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133\": rpc error: code = NotFound desc = could not find container \"84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133\": container with ID starting with 84c8850f863bc1d090fc3c7878344e008033ea4baaa58cd9fe5c3b39eaa12133 not found: ID does not exist" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.097115 4826 scope.go:117] "RemoveContainer" containerID="b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41" Mar 19 19:10:49 crc kubenswrapper[4826]: E0319 19:10:49.104163 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41\": container with ID starting with b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41 not found: ID does not exist" containerID="b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.104199 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41"} err="failed to get container status \"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41\": rpc error: code = NotFound desc = could not find container \"b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41\": container with ID starting with b9cedc5e45f33603664dd3c624c1d5ad354791bfcfd655bc0afdd9cb5dc3cd41 not found: ID does not exist" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.104214 4826 scope.go:117] "RemoveContainer" containerID="13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9" Mar 19 19:10:49 crc kubenswrapper[4826]: E0319 19:10:49.104504 4826 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9\": container with ID starting with 13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9 not found: ID does not exist" containerID="13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.104525 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9"} err="failed to get container status \"13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9\": rpc error: code = NotFound desc = could not find container \"13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9\": container with ID starting with 13dfc3fc8c2bbc6845d15efa9b922b3db7a2d44941bc51c0619abadf8d344ea9 not found: ID does not exist" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.562762 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2577c1a0-e579-4d7c-a2e6-2494e67afbc1" (UID: "2577c1a0-e579-4d7c-a2e6-2494e67afbc1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.613816 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.620911 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-klj9b"] Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.665009 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2577c1a0-e579-4d7c-a2e6-2494e67afbc1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:49 crc kubenswrapper[4826]: I0319 19:10:49.987833 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" path="/var/lib/kubelet/pods/2577c1a0-e579-4d7c-a2e6-2494e67afbc1/volumes" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.877696 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-bqh6c"] Mar 19 19:10:52 crc kubenswrapper[4826]: E0319 19:10:52.878343 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="extract-utilities" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.878355 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="extract-utilities" Mar 19 19:10:52 crc kubenswrapper[4826]: E0319 19:10:52.878374 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="extract-content" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.878381 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="extract-content" Mar 19 19:10:52 crc kubenswrapper[4826]: E0319 19:10:52.878398 4826 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="registry-server" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.878404 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="registry-server" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.878511 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2577c1a0-e579-4d7c-a2e6-2494e67afbc1" containerName="registry-server" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.878993 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bqh6c" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.883479 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-xdnmr" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.883574 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.884199 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.885204 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.886257 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.895694 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.913144 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bqh6c"] Mar 19 19:10:52 crc kubenswrapper[4826]: I0319 19:10:52.981806 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-logging/collector-bqh6c"] Mar 19 19:10:52 crc kubenswrapper[4826]: E0319 19:10:52.982497 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-tk7jt metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-bqh6c" podUID="d7317524-d276-4126-be2a-236c742b9254" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.010545 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.017989 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018047 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018081 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018222 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" 
(UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018377 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018454 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk7jt\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018551 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018596 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018644 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018694 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018707 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.018740 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120485 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120536 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120578 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120707 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120749 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk7jt\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120787 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120811 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120843 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120869 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120898 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120929 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.120960 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.122148 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: 
I0319 19:10:53.122605 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.123035 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.123715 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.128010 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.129119 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.129949 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.135668 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.138150 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.141140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk7jt\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt\") pod \"collector-bqh6c\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " pod="openshift-logging/collector-bqh6c" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222253 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222332 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: 
I0319 19:10:53.222377 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222415 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222464 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222486 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222514 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token\") pod 
\"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222498 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir" (OuterVolumeSpecName: "datadir") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222537 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.222562 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223029 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config" (OuterVolumeSpecName: "config") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223120 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk7jt\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt\") pod \"d7317524-d276-4126-be2a-236c742b9254\" (UID: \"d7317524-d276-4126-be2a-236c742b9254\") " Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223130 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223441 4826 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d7317524-d276-4126-be2a-236c742b9254-datadir\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223458 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223467 4826 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223515 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.223790 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.225801 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.225956 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp" (OuterVolumeSpecName: "tmp") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.226014 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token" (OuterVolumeSpecName: "sa-token") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.226508 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt" (OuterVolumeSpecName: "kube-api-access-tk7jt") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "kube-api-access-tk7jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.227162 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics" (OuterVolumeSpecName: "metrics") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.227186 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token" (OuterVolumeSpecName: "collector-token") pod "d7317524-d276-4126-be2a-236c742b9254" (UID: "d7317524-d276-4126-be2a-236c742b9254"). InnerVolumeSpecName "collector-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325368 4826 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325420 4826 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7317524-d276-4126-be2a-236c742b9254-tmp\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325438 4826 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325461 4826 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325483 4826 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d7317524-d276-4126-be2a-236c742b9254-collector-token\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325500 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk7jt\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-kube-api-access-tk7jt\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 19:10:53.325516 4826 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d7317524-d276-4126-be2a-236c742b9254-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:53 crc kubenswrapper[4826]: I0319 
19:10:53.325534 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7317524-d276-4126-be2a-236c742b9254-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.019216 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-bqh6c" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.079974 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-bqh6c"] Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.085479 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-bqh6c"] Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.136809 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-76xlf"] Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.137697 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.140822 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.140886 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.141010 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.141179 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-xdnmr" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.141413 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.153747 4826 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.201729 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-76xlf"] Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238180 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-tmp\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238230 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238249 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-metrics\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238346 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-sa-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238380 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-trusted-ca\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238401 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-datadir\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238457 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-syslog-receiver\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238529 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-entrypoint\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238567 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59shs\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-kube-api-access-59shs\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238633 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.238712 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config-openshift-service-cacrt\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339615 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-tmp\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339700 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339732 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-metrics\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339777 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-sa-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") 
" pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339821 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-trusted-ca\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339848 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-datadir\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339920 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-syslog-receiver\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339948 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-entrypoint\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.339966 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59shs\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-kube-api-access-59shs\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.340003 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.340037 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config-openshift-service-cacrt\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.340641 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config-openshift-service-cacrt\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.341329 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-entrypoint\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.341760 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-trusted-ca\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.341873 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-datadir\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.342274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-config\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.343229 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-tmp\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.343439 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-syslog-receiver\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.343591 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-metrics\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.344151 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-collector-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc 
kubenswrapper[4826]: I0319 19:10:54.364790 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-sa-token\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.365982 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59shs\" (UniqueName: \"kubernetes.io/projected/ec6a09bc-174e-4e15-a61c-74bac9b3baa3-kube-api-access-59shs\") pod \"collector-76xlf\" (UID: \"ec6a09bc-174e-4e15-a61c-74bac9b3baa3\") " pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.457204 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-76xlf" Mar 19 19:10:54 crc kubenswrapper[4826]: I0319 19:10:54.878900 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-76xlf"] Mar 19 19:10:55 crc kubenswrapper[4826]: I0319 19:10:55.028555 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-76xlf" event={"ID":"ec6a09bc-174e-4e15-a61c-74bac9b3baa3","Type":"ContainerStarted","Data":"9cf33b3e7db205691377c3528a8bbd7f88f04ca32b19d586e48eca4c5240f5ee"} Mar 19 19:10:55 crc kubenswrapper[4826]: I0319 19:10:55.991820 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7317524-d276-4126-be2a-236c742b9254" path="/var/lib/kubelet/pods/d7317524-d276-4126-be2a-236c742b9254/volumes" Mar 19 19:11:00 crc kubenswrapper[4826]: I0319 19:11:00.074719 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-76xlf" event={"ID":"ec6a09bc-174e-4e15-a61c-74bac9b3baa3","Type":"ContainerStarted","Data":"d461a70b659456881d78da518ed279ce1bbf0f33b92179ae07aeb2c786e6d14b"} Mar 19 19:11:00 crc kubenswrapper[4826]: I0319 19:11:00.102728 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-76xlf" podStartSLOduration=1.819292197 podStartE2EDuration="6.102709857s" podCreationTimestamp="2026-03-19 19:10:54 +0000 UTC" firstStartedPulling="2026-03-19 19:10:54.880366806 +0000 UTC m=+879.634435119" lastFinishedPulling="2026-03-19 19:10:59.163784426 +0000 UTC m=+883.917852779" observedRunningTime="2026-03-19 19:11:00.099993071 +0000 UTC m=+884.854061474" watchObservedRunningTime="2026-03-19 19:11:00.102709857 +0000 UTC m=+884.856778180" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.145335 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.155554 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.155696 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.290866 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.291277 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlhp\" (UniqueName: \"kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.291343 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.392943 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlhp\" (UniqueName: \"kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.393033 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content\") pod 
\"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.393082 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.393700 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.393862 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.413446 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlhp\" (UniqueName: \"kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp\") pod \"redhat-marketplace-dmh28\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.481143 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:12 crc kubenswrapper[4826]: I0319 19:11:12.935513 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:13 crc kubenswrapper[4826]: I0319 19:11:13.201766 4826 generic.go:334] "Generic (PLEG): container finished" podID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerID="f3f1edd3c076f036101b3c40c3a65a6adb296d8bbed9e027eebd4f204bf57127" exitCode=0 Mar 19 19:11:13 crc kubenswrapper[4826]: I0319 19:11:13.201829 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerDied","Data":"f3f1edd3c076f036101b3c40c3a65a6adb296d8bbed9e027eebd4f204bf57127"} Mar 19 19:11:13 crc kubenswrapper[4826]: I0319 19:11:13.201911 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerStarted","Data":"7d97319fa262be761ff369bb7505d9ad9c62ba39c4c119150e930b68e0a1ccf9"} Mar 19 19:11:15 crc kubenswrapper[4826]: I0319 19:11:15.222981 4826 generic.go:334] "Generic (PLEG): container finished" podID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerID="9ed41f3822a8304e04979e86ce35e8cbeb0677462e0e24b59bb97f9799354ab5" exitCode=0 Mar 19 19:11:15 crc kubenswrapper[4826]: I0319 19:11:15.223596 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerDied","Data":"9ed41f3822a8304e04979e86ce35e8cbeb0677462e0e24b59bb97f9799354ab5"} Mar 19 19:11:16 crc kubenswrapper[4826]: I0319 19:11:16.234076 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" 
event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerStarted","Data":"d4d086f860457e06a76634d8648db5c3123c55b723c9efe84cd21a64cefdad84"} Mar 19 19:11:16 crc kubenswrapper[4826]: I0319 19:11:16.256836 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmh28" podStartSLOduration=1.816674768 podStartE2EDuration="4.256818784s" podCreationTimestamp="2026-03-19 19:11:12 +0000 UTC" firstStartedPulling="2026-03-19 19:11:13.203965051 +0000 UTC m=+897.958033374" lastFinishedPulling="2026-03-19 19:11:15.644109037 +0000 UTC m=+900.398177390" observedRunningTime="2026-03-19 19:11:16.254513468 +0000 UTC m=+901.008581861" watchObservedRunningTime="2026-03-19 19:11:16.256818784 +0000 UTC m=+901.010887107" Mar 19 19:11:22 crc kubenswrapper[4826]: I0319 19:11:22.482035 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:22 crc kubenswrapper[4826]: I0319 19:11:22.482699 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:22 crc kubenswrapper[4826]: I0319 19:11:22.537156 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:23 crc kubenswrapper[4826]: I0319 19:11:23.378084 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:23 crc kubenswrapper[4826]: I0319 19:11:23.435499 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:25 crc kubenswrapper[4826]: I0319 19:11:25.327250 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dmh28" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="registry-server" 
containerID="cri-o://d4d086f860457e06a76634d8648db5c3123c55b723c9efe84cd21a64cefdad84" gracePeriod=2 Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.336930 4826 generic.go:334] "Generic (PLEG): container finished" podID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerID="d4d086f860457e06a76634d8648db5c3123c55b723c9efe84cd21a64cefdad84" exitCode=0 Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.336994 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerDied","Data":"d4d086f860457e06a76634d8648db5c3123c55b723c9efe84cd21a64cefdad84"} Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.337284 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmh28" event={"ID":"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0","Type":"ContainerDied","Data":"7d97319fa262be761ff369bb7505d9ad9c62ba39c4c119150e930b68e0a1ccf9"} Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.337300 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d97319fa262be761ff369bb7505d9ad9c62ba39c4c119150e930b68e0a1ccf9" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.377580 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.531120 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities\") pod \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.531175 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlhp\" (UniqueName: \"kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp\") pod \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.531237 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content\") pod \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\" (UID: \"fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0\") " Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.532822 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities" (OuterVolumeSpecName: "utilities") pod "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" (UID: "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.543643 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp" (OuterVolumeSpecName: "kube-api-access-xjlhp") pod "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" (UID: "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0"). InnerVolumeSpecName "kube-api-access-xjlhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.577684 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" (UID: "fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.633178 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.633230 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlhp\" (UniqueName: \"kubernetes.io/projected/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-kube-api-access-xjlhp\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:26 crc kubenswrapper[4826]: I0319 19:11:26.633256 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:27 crc kubenswrapper[4826]: I0319 19:11:27.350000 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmh28" Mar 19 19:11:27 crc kubenswrapper[4826]: I0319 19:11:27.411376 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:27 crc kubenswrapper[4826]: I0319 19:11:27.422749 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmh28"] Mar 19 19:11:27 crc kubenswrapper[4826]: I0319 19:11:27.989250 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" path="/var/lib/kubelet/pods/fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0/volumes" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.167019 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7"] Mar 19 19:11:32 crc kubenswrapper[4826]: E0319 19:11:32.167735 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="extract-utilities" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.167749 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="extract-utilities" Mar 19 19:11:32 crc kubenswrapper[4826]: E0319 19:11:32.167761 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="registry-server" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.167769 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="registry-server" Mar 19 19:11:32 crc kubenswrapper[4826]: E0319 19:11:32.167796 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="extract-content" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.167802 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="extract-content" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.167930 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb58021b-8ae1-4b43-a6af-e1e57a9e9ec0" containerName="registry-server" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.169060 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.177340 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.192543 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7"] Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.225025 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvsw7\" (UniqueName: \"kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.225304 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.225408 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.326886 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvsw7\" (UniqueName: \"kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.326963 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.327028 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.327611 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util\") 
pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.327677 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.359806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvsw7\" (UniqueName: \"kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.484315 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:32 crc kubenswrapper[4826]: I0319 19:11:32.954852 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7"] Mar 19 19:11:33 crc kubenswrapper[4826]: I0319 19:11:33.395751 4826 generic.go:334] "Generic (PLEG): container finished" podID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerID="ed017fdd6a475dd57882285e7f5b9d5ce20e34efd8d0ec0785eb4e835f6767b7" exitCode=0 Mar 19 19:11:33 crc kubenswrapper[4826]: I0319 19:11:33.395812 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" event={"ID":"d850e5c4-d43d-452c-9d27-69cb4cda0dd5","Type":"ContainerDied","Data":"ed017fdd6a475dd57882285e7f5b9d5ce20e34efd8d0ec0785eb4e835f6767b7"} Mar 19 19:11:33 crc kubenswrapper[4826]: I0319 19:11:33.396058 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" event={"ID":"d850e5c4-d43d-452c-9d27-69cb4cda0dd5","Type":"ContainerStarted","Data":"84416fc7986944e755154774fcfd967836fd0112c1f9a023a8757f6551fe2547"} Mar 19 19:11:35 crc kubenswrapper[4826]: I0319 19:11:35.421807 4826 generic.go:334] "Generic (PLEG): container finished" podID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerID="c5b085d56830f2c9e551e7327cae173345a4858c81d4461b826fa00b3bea6c50" exitCode=0 Mar 19 19:11:35 crc kubenswrapper[4826]: I0319 19:11:35.421951 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" event={"ID":"d850e5c4-d43d-452c-9d27-69cb4cda0dd5","Type":"ContainerDied","Data":"c5b085d56830f2c9e551e7327cae173345a4858c81d4461b826fa00b3bea6c50"} Mar 19 19:11:36 crc kubenswrapper[4826]: I0319 19:11:36.436008 4826 
generic.go:334] "Generic (PLEG): container finished" podID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerID="5d328e4434432a401c7085f31a10b1c20c826b357ece51c7a02c4aa62bb7828e" exitCode=0 Mar 19 19:11:36 crc kubenswrapper[4826]: I0319 19:11:36.436052 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" event={"ID":"d850e5c4-d43d-452c-9d27-69cb4cda0dd5","Type":"ContainerDied","Data":"5d328e4434432a401c7085f31a10b1c20c826b357ece51c7a02c4aa62bb7828e"} Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.781278 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.949251 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvsw7\" (UniqueName: \"kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7\") pod \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.949297 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util\") pod \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.949348 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle\") pod \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\" (UID: \"d850e5c4-d43d-452c-9d27-69cb4cda0dd5\") " Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.950377 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle" (OuterVolumeSpecName: "bundle") pod "d850e5c4-d43d-452c-9d27-69cb4cda0dd5" (UID: "d850e5c4-d43d-452c-9d27-69cb4cda0dd5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.958140 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7" (OuterVolumeSpecName: "kube-api-access-qvsw7") pod "d850e5c4-d43d-452c-9d27-69cb4cda0dd5" (UID: "d850e5c4-d43d-452c-9d27-69cb4cda0dd5"). InnerVolumeSpecName "kube-api-access-qvsw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:11:37 crc kubenswrapper[4826]: I0319 19:11:37.968628 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util" (OuterVolumeSpecName: "util") pod "d850e5c4-d43d-452c-9d27-69cb4cda0dd5" (UID: "d850e5c4-d43d-452c-9d27-69cb4cda0dd5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.051424 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvsw7\" (UniqueName: \"kubernetes.io/projected/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-kube-api-access-qvsw7\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.051477 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.051506 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d850e5c4-d43d-452c-9d27-69cb4cda0dd5-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.464585 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" event={"ID":"d850e5c4-d43d-452c-9d27-69cb4cda0dd5","Type":"ContainerDied","Data":"84416fc7986944e755154774fcfd967836fd0112c1f9a023a8757f6551fe2547"} Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.464685 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84416fc7986944e755154774fcfd967836fd0112c1f9a023a8757f6551fe2547" Mar 19 19:11:38 crc kubenswrapper[4826]: I0319 19:11:38.464689 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.221257 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jtglg"] Mar 19 19:11:44 crc kubenswrapper[4826]: E0319 19:11:44.222257 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="util" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.222276 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="util" Mar 19 19:11:44 crc kubenswrapper[4826]: E0319 19:11:44.222299 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="pull" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.222308 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="pull" Mar 19 19:11:44 crc kubenswrapper[4826]: E0319 19:11:44.222324 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="extract" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.222337 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="extract" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.222531 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d850e5c4-d43d-452c-9d27-69cb4cda0dd5" containerName="extract" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.223344 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.228173 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-2fmk9" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.228316 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.228363 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.231291 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jtglg"] Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.356747 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gf5q\" (UniqueName: \"kubernetes.io/projected/b08225e9-d4ce-4b9f-b87e-c586e3d8c47f-kube-api-access-4gf5q\") pod \"nmstate-operator-796d4cfff4-jtglg\" (UID: \"b08225e9-d4ce-4b9f-b87e-c586e3d8c47f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.458790 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gf5q\" (UniqueName: \"kubernetes.io/projected/b08225e9-d4ce-4b9f-b87e-c586e3d8c47f-kube-api-access-4gf5q\") pod \"nmstate-operator-796d4cfff4-jtglg\" (UID: \"b08225e9-d4ce-4b9f-b87e-c586e3d8c47f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.505210 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gf5q\" (UniqueName: \"kubernetes.io/projected/b08225e9-d4ce-4b9f-b87e-c586e3d8c47f-kube-api-access-4gf5q\") pod \"nmstate-operator-796d4cfff4-jtglg\" (UID: 
\"b08225e9-d4ce-4b9f-b87e-c586e3d8c47f\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" Mar 19 19:11:44 crc kubenswrapper[4826]: I0319 19:11:44.541964 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" Mar 19 19:11:45 crc kubenswrapper[4826]: I0319 19:11:45.041326 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-jtglg"] Mar 19 19:11:45 crc kubenswrapper[4826]: I0319 19:11:45.525815 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" event={"ID":"b08225e9-d4ce-4b9f-b87e-c586e3d8c47f","Type":"ContainerStarted","Data":"825a8f3a31732afb67079d3bf901571b2c82cd4735aa0e1179a6918c75ff53c5"} Mar 19 19:11:47 crc kubenswrapper[4826]: I0319 19:11:47.548969 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" event={"ID":"b08225e9-d4ce-4b9f-b87e-c586e3d8c47f","Type":"ContainerStarted","Data":"fd5cf0b462068f38e1c44d3c1afdc3cb6e5a43a414e446a5220f1bb90255aef4"} Mar 19 19:11:47 crc kubenswrapper[4826]: I0319 19:11:47.577385 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-jtglg" podStartSLOduration=1.367491827 podStartE2EDuration="3.577353374s" podCreationTimestamp="2026-03-19 19:11:44 +0000 UTC" firstStartedPulling="2026-03-19 19:11:45.042209076 +0000 UTC m=+929.796277429" lastFinishedPulling="2026-03-19 19:11:47.252070623 +0000 UTC m=+932.006138976" observedRunningTime="2026-03-19 19:11:47.567626369 +0000 UTC m=+932.321694722" watchObservedRunningTime="2026-03-19 19:11:47.577353374 +0000 UTC m=+932.331421938" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.284996 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 
19:11:53.286690 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.294541 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-b88ww" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.306694 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.326922 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.334941 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.338534 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.346256 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.357058 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zqv4g"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.358181 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.405475 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-ovs-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.405855 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-dbus-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.405887 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8pxh\" (UniqueName: \"kubernetes.io/projected/6365b3ab-4090-4880-b583-0ad075dd3c4d-kube-api-access-x8pxh\") pod \"nmstate-metrics-9b8c8685d-ws7kv\" (UID: \"6365b3ab-4090-4880-b583-0ad075dd3c4d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.405917 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-nmstate-lock\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.405990 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair\") pod 
\"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.406026 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs82w\" (UniqueName: \"kubernetes.io/projected/afb786fa-7916-4f36-9978-5bd829c9dbf8-kube-api-access-hs82w\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.406049 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mzp\" (UniqueName: \"kubernetes.io/projected/9f450458-8845-4ec5-9971-6df9dd448312-kube-api-access-d6mzp\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.438242 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.439151 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.444503 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m66v8" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.447622 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.447632 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.455048 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507339 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-ovs-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507405 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-dbus-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507430 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34bdc1e-adc4-4363-b623-6644010a58dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc 
kubenswrapper[4826]: I0319 19:11:53.507451 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8pxh\" (UniqueName: \"kubernetes.io/projected/6365b3ab-4090-4880-b583-0ad075dd3c4d-kube-api-access-x8pxh\") pod \"nmstate-metrics-9b8c8685d-ws7kv\" (UID: \"6365b3ab-4090-4880-b583-0ad075dd3c4d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507472 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-nmstate-lock\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507486 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-ovs-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507496 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507608 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-nmstate-lock\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507670 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: E0319 19:11:53.507627 4826 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507740 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9f450458-8845-4ec5-9971-6df9dd448312-dbus-socket\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: E0319 19:11:53.507783 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair podName:afb786fa-7916-4f36-9978-5bd829c9dbf8 nodeName:}" failed. No retries permitted until 2026-03-19 19:11:54.007759813 +0000 UTC m=+938.761828126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair") pod "nmstate-webhook-5f558f5558-d4pjw" (UID: "afb786fa-7916-4f36-9978-5bd829c9dbf8") : secret "openshift-nmstate-webhook" not found Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507914 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs82w\" (UniqueName: \"kubernetes.io/projected/afb786fa-7916-4f36-9978-5bd829c9dbf8-kube-api-access-hs82w\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.507977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mzp\" (UniqueName: \"kubernetes.io/projected/9f450458-8845-4ec5-9971-6df9dd448312-kube-api-access-d6mzp\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.508081 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6tm\" (UniqueName: \"kubernetes.io/projected/d34bdc1e-adc4-4363-b623-6644010a58dc-kube-api-access-4g6tm\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.525807 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8pxh\" (UniqueName: \"kubernetes.io/projected/6365b3ab-4090-4880-b583-0ad075dd3c4d-kube-api-access-x8pxh\") pod \"nmstate-metrics-9b8c8685d-ws7kv\" (UID: \"6365b3ab-4090-4880-b583-0ad075dd3c4d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 
19:11:53.525954 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mzp\" (UniqueName: \"kubernetes.io/projected/9f450458-8845-4ec5-9971-6df9dd448312-kube-api-access-d6mzp\") pod \"nmstate-handler-zqv4g\" (UID: \"9f450458-8845-4ec5-9971-6df9dd448312\") " pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.536573 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs82w\" (UniqueName: \"kubernetes.io/projected/afb786fa-7916-4f36-9978-5bd829c9dbf8-kube-api-access-hs82w\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.601820 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.609846 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.609935 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g6tm\" (UniqueName: \"kubernetes.io/projected/d34bdc1e-adc4-4363-b623-6644010a58dc-kube-api-access-4g6tm\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.610002 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/d34bdc1e-adc4-4363-b623-6644010a58dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: E0319 19:11:53.610197 4826 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 19:11:53 crc kubenswrapper[4826]: E0319 19:11:53.610270 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert podName:d34bdc1e-adc4-4363-b623-6644010a58dc nodeName:}" failed. No retries permitted until 2026-03-19 19:11:54.110251506 +0000 UTC m=+938.864319819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-8f82b" (UID: "d34bdc1e-adc4-4363-b623-6644010a58dc") : secret "plugin-serving-cert" not found Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.610908 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34bdc1e-adc4-4363-b623-6644010a58dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.628982 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g6tm\" (UniqueName: \"kubernetes.io/projected/d34bdc1e-adc4-4363-b623-6644010a58dc-kube-api-access-4g6tm\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.641761 4826 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.642613 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.653636 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.672237 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711067 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711253 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6v2x\" (UniqueName: \"kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711296 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711341 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711388 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711406 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.711438 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813281 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813340 
4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813384 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813436 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813473 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6v2x\" (UniqueName: \"kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813510 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.813551 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.815585 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.816232 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.816940 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.823368 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.823540 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.823540 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.834022 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6v2x\" (UniqueName: \"kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x\") pod \"console-54c8fbc9df-9kq4s\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:53 crc kubenswrapper[4826]: I0319 19:11:53.986648 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.016573 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.022233 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/afb786fa-7916-4f36-9978-5bd829c9dbf8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-d4pjw\" (UID: \"afb786fa-7916-4f36-9978-5bd829c9dbf8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.068292 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv"] Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.119530 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.123530 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34bdc1e-adc4-4363-b623-6644010a58dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8f82b\" (UID: \"d34bdc1e-adc4-4363-b623-6644010a58dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.249781 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.354344 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.441396 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.617987 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqv4g" event={"ID":"9f450458-8845-4ec5-9971-6df9dd448312","Type":"ContainerStarted","Data":"27145ff56d78e9b0656abc0466c1a01c1ed8a78129ef701721de48baad49b944"} Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.619246 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" event={"ID":"6365b3ab-4090-4880-b583-0ad075dd3c4d","Type":"ContainerStarted","Data":"be36cb96a85b707e0077c159bac419469ca95e379aa807c5f1479a97463b19f9"} Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.620622 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c8fbc9df-9kq4s" event={"ID":"8c238d2b-d512-419b-92dc-6d68bcba8655","Type":"ContainerStarted","Data":"12a01f38c7a125e4d2b8d03e1eb40cc01aec53b8fd0570c19324e12fdbc5e797"} Mar 19 19:11:54 crc kubenswrapper[4826]: W0319 19:11:54.675047 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb786fa_7916_4f36_9978_5bd829c9dbf8.slice/crio-36ae4d48fee2e43f5747db8e27e1a50f6f73be239ae872c24ad9dc2957ba7651 WatchSource:0}: Error finding container 36ae4d48fee2e43f5747db8e27e1a50f6f73be239ae872c24ad9dc2957ba7651: Status 404 returned error can't find the container with id 36ae4d48fee2e43f5747db8e27e1a50f6f73be239ae872c24ad9dc2957ba7651 Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 
19:11:54.676424 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw"] Mar 19 19:11:54 crc kubenswrapper[4826]: I0319 19:11:54.935804 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b"] Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.400251 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.400620 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.629347 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c8fbc9df-9kq4s" event={"ID":"8c238d2b-d512-419b-92dc-6d68bcba8655","Type":"ContainerStarted","Data":"ef28f7712a323c8c461d246dd313b2eba2451f7028e39ebb46d68a6c3c18879e"} Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.632703 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" event={"ID":"afb786fa-7916-4f36-9978-5bd829c9dbf8","Type":"ContainerStarted","Data":"36ae4d48fee2e43f5747db8e27e1a50f6f73be239ae872c24ad9dc2957ba7651"} Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.640822 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" 
event={"ID":"d34bdc1e-adc4-4363-b623-6644010a58dc","Type":"ContainerStarted","Data":"de1e1fa504e7a6b4beb82611b0f3452bae294117a7af801b7a08bbba826bfe15"} Mar 19 19:11:55 crc kubenswrapper[4826]: I0319 19:11:55.654809 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54c8fbc9df-9kq4s" podStartSLOduration=2.6547897369999998 podStartE2EDuration="2.654789737s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:11:55.650306888 +0000 UTC m=+940.404375221" watchObservedRunningTime="2026-03-19 19:11:55.654789737 +0000 UTC m=+940.408858040" Mar 19 19:11:56 crc kubenswrapper[4826]: I0319 19:11:56.655954 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" event={"ID":"6365b3ab-4090-4880-b583-0ad075dd3c4d","Type":"ContainerStarted","Data":"c8912f0ad612ec0eb4db615df42d18f8b7cbcd3b36d75c03db80f46329a01751"} Mar 19 19:11:56 crc kubenswrapper[4826]: I0319 19:11:56.658845 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" event={"ID":"afb786fa-7916-4f36-9978-5bd829c9dbf8","Type":"ContainerStarted","Data":"779f8b42ab1b7e8b413c3e870b0eb37c1209391320d3e52b3ec5417b23a4d120"} Mar 19 19:11:56 crc kubenswrapper[4826]: I0319 19:11:56.702303 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" podStartSLOduration=2.014345709 podStartE2EDuration="3.702271808s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" firstStartedPulling="2026-03-19 19:11:54.678098841 +0000 UTC m=+939.432167154" lastFinishedPulling="2026-03-19 19:11:56.36602494 +0000 UTC m=+941.120093253" observedRunningTime="2026-03-19 19:11:56.700647949 +0000 UTC m=+941.454716292" watchObservedRunningTime="2026-03-19 19:11:56.702271808 +0000 UTC 
m=+941.456340121" Mar 19 19:11:57 crc kubenswrapper[4826]: I0319 19:11:57.670446 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqv4g" event={"ID":"9f450458-8845-4ec5-9971-6df9dd448312","Type":"ContainerStarted","Data":"520ced854653c13d1442be957c5499bf3e78280efb1d977f8e270e57e75f08bb"} Mar 19 19:11:57 crc kubenswrapper[4826]: I0319 19:11:57.671412 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 19:11:57 crc kubenswrapper[4826]: I0319 19:11:57.695641 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zqv4g" podStartSLOduration=2.043954357 podStartE2EDuration="4.695617808s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" firstStartedPulling="2026-03-19 19:11:53.712532255 +0000 UTC m=+938.466600568" lastFinishedPulling="2026-03-19 19:11:56.364195706 +0000 UTC m=+941.118264019" observedRunningTime="2026-03-19 19:11:57.689899209 +0000 UTC m=+942.443967522" watchObservedRunningTime="2026-03-19 19:11:57.695617808 +0000 UTC m=+942.449686121" Mar 19 19:11:58 crc kubenswrapper[4826]: I0319 19:11:58.672416 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 19:11:58 crc kubenswrapper[4826]: I0319 19:11:58.681860 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" event={"ID":"d34bdc1e-adc4-4363-b623-6644010a58dc","Type":"ContainerStarted","Data":"eca4e6637e24aaa399b670b626532db0f672bbf872a2cfff1932677a18769bf0"} Mar 19 19:11:58 crc kubenswrapper[4826]: I0319 19:11:58.705715 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8f82b" podStartSLOduration=2.7186444339999998 podStartE2EDuration="5.705688472s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" 
firstStartedPulling="2026-03-19 19:11:54.944290621 +0000 UTC m=+939.698358934" lastFinishedPulling="2026-03-19 19:11:57.931334639 +0000 UTC m=+942.685402972" observedRunningTime="2026-03-19 19:11:58.696794047 +0000 UTC m=+943.450862390" watchObservedRunningTime="2026-03-19 19:11:58.705688472 +0000 UTC m=+943.459756815" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.155037 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565792-h8n58"] Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.156098 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-h8n58" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.159540 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.159842 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.160570 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.162916 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-h8n58"] Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.243024 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j826n\" (UniqueName: \"kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n\") pod \"auto-csr-approver-29565792-h8n58\" (UID: \"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae\") " pod="openshift-infra/auto-csr-approver-29565792-h8n58" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.345238 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j826n\" 
(UniqueName: \"kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n\") pod \"auto-csr-approver-29565792-h8n58\" (UID: \"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae\") " pod="openshift-infra/auto-csr-approver-29565792-h8n58" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.362165 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j826n\" (UniqueName: \"kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n\") pod \"auto-csr-approver-29565792-h8n58\" (UID: \"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae\") " pod="openshift-infra/auto-csr-approver-29565792-h8n58" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.496993 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-h8n58" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.696781 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" event={"ID":"6365b3ab-4090-4880-b583-0ad075dd3c4d","Type":"ContainerStarted","Data":"0c20691dc65a4c4ba611edc9a5de1b8499b34b42eb0f34c15b7b9cee5fa6e3fe"} Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.728622 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ws7kv" podStartSLOduration=1.427050728 podStartE2EDuration="7.728605159s" podCreationTimestamp="2026-03-19 19:11:53 +0000 UTC" firstStartedPulling="2026-03-19 19:11:54.088582157 +0000 UTC m=+938.842650470" lastFinishedPulling="2026-03-19 19:12:00.390136588 +0000 UTC m=+945.144204901" observedRunningTime="2026-03-19 19:12:00.717590452 +0000 UTC m=+945.471658795" watchObservedRunningTime="2026-03-19 19:12:00.728605159 +0000 UTC m=+945.482673472" Mar 19 19:12:00 crc kubenswrapper[4826]: I0319 19:12:00.986044 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-h8n58"] Mar 19 19:12:01 
crc kubenswrapper[4826]: I0319 19:12:01.704833 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-h8n58" event={"ID":"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae","Type":"ContainerStarted","Data":"f7351864051063a89a26aa6446beb3739648b8ae68f7fd5f3382954f6e8b0166"}
Mar 19 19:12:02 crc kubenswrapper[4826]: I0319 19:12:02.721355 4826 generic.go:334] "Generic (PLEG): container finished" podID="a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" containerID="665e7e9dcfc20ff31b70f3ab3bfdd29cb1836b7d0c40dc480e7cb291800adfd0" exitCode=0
Mar 19 19:12:02 crc kubenswrapper[4826]: I0319 19:12:02.721453 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-h8n58" event={"ID":"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae","Type":"ContainerDied","Data":"665e7e9dcfc20ff31b70f3ab3bfdd29cb1836b7d0c40dc480e7cb291800adfd0"}
Mar 19 19:12:03 crc kubenswrapper[4826]: I0319 19:12:03.718376 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zqv4g"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.008237 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54c8fbc9df-9kq4s"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.008282 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54c8fbc9df-9kq4s"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.008349 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54c8fbc9df-9kq4s"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.015734 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54c8fbc9df-9kq4s"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.108370 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"]
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.137173 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-h8n58"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.321504 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j826n\" (UniqueName: \"kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n\") pod \"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae\" (UID: \"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae\") "
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.383714 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n" (OuterVolumeSpecName: "kube-api-access-j826n") pod "a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" (UID: "a5ff4e07-a700-4be9-a448-7ccc4a75f8ae"). InnerVolumeSpecName "kube-api-access-j826n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.423885 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j826n\" (UniqueName: \"kubernetes.io/projected/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae-kube-api-access-j826n\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.741892 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565792-h8n58" event={"ID":"a5ff4e07-a700-4be9-a448-7ccc4a75f8ae","Type":"ContainerDied","Data":"f7351864051063a89a26aa6446beb3739648b8ae68f7fd5f3382954f6e8b0166"}
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.741966 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7351864051063a89a26aa6446beb3739648b8ae68f7fd5f3382954f6e8b0166"
Mar 19 19:12:04 crc kubenswrapper[4826]: I0319 19:12:04.742193 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565792-h8n58"
Mar 19 19:12:05 crc kubenswrapper[4826]: I0319 19:12:05.224957 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-5485l"]
Mar 19 19:12:05 crc kubenswrapper[4826]: I0319 19:12:05.234556 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565786-5485l"]
Mar 19 19:12:05 crc kubenswrapper[4826]: I0319 19:12:05.993340 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18768f0-9479-4355-a573-f689a0e019d7" path="/var/lib/kubelet/pods/e18768f0-9479-4355-a573-f689a0e019d7/volumes"
Mar 19 19:12:14 crc kubenswrapper[4826]: I0319 19:12:14.256773 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw"
Mar 19 19:12:16 crc kubenswrapper[4826]: I0319 19:12:16.981018 4826 scope.go:117] "RemoveContainer" containerID="b5b2fb51112651a9da8493466d7b767343ce35db5484ed71b10d42ff42355bde"
Mar 19 19:12:25 crc kubenswrapper[4826]: I0319 19:12:25.418918 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 19:12:25 crc kubenswrapper[4826]: I0319 19:12:25.419421 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.149533 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-55ff78d854-f4zw8" podUID="000caec5-bbed-4ed7-b946-7f859ab61e98" containerName="console" containerID="cri-o://1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862" gracePeriod=15
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.574453 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55ff78d854-f4zw8_000caec5-bbed-4ed7-b946-7f859ab61e98/console/0.log"
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.575032 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ff78d854-f4zw8"
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.708739 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.708831 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.708867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.708940 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8jpn\" (UniqueName: \"kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.708994 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.709031 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.709070 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert\") pod \"000caec5-bbed-4ed7-b946-7f859ab61e98\" (UID: \"000caec5-bbed-4ed7-b946-7f859ab61e98\") "
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.709662 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca" (OuterVolumeSpecName: "service-ca") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.709996 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.710250 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config" (OuterVolumeSpecName: "console-config") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.710280 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.721938 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn" (OuterVolumeSpecName: "kube-api-access-n8jpn") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "kube-api-access-n8jpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.721952 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.722035 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "000caec5-bbed-4ed7-b946-7f859ab61e98" (UID: "000caec5-bbed-4ed7-b946-7f859ab61e98"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810480 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810525 4826 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810542 4826 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/000caec5-bbed-4ed7-b946-7f859ab61e98-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810554 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8jpn\" (UniqueName: \"kubernetes.io/projected/000caec5-bbed-4ed7-b946-7f859ab61e98-kube-api-access-n8jpn\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810565 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810575 4826 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-console-config\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.810587 4826 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/000caec5-bbed-4ed7-b946-7f859ab61e98-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986409 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55ff78d854-f4zw8_000caec5-bbed-4ed7-b946-7f859ab61e98/console/0.log"
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986455 4826 generic.go:334] "Generic (PLEG): container finished" podID="000caec5-bbed-4ed7-b946-7f859ab61e98" containerID="1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862" exitCode=2
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986480 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-f4zw8" event={"ID":"000caec5-bbed-4ed7-b946-7f859ab61e98","Type":"ContainerDied","Data":"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"}
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986499 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-f4zw8" event={"ID":"000caec5-bbed-4ed7-b946-7f859ab61e98","Type":"ContainerDied","Data":"4ea2184973e17bbcfadc97427f2f12e44b8c499e6d60add58ea11b195fb32c4e"}
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986515 4826 scope.go:117] "RemoveContainer" containerID="1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"
Mar 19 19:12:29 crc kubenswrapper[4826]: I0319 19:12:29.986519 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ff78d854-f4zw8"
Mar 19 19:12:30 crc kubenswrapper[4826]: I0319 19:12:30.016592 4826 scope.go:117] "RemoveContainer" containerID="1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"
Mar 19 19:12:30 crc kubenswrapper[4826]: E0319 19:12:30.017284 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862\": container with ID starting with 1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862 not found: ID does not exist" containerID="1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"
Mar 19 19:12:30 crc kubenswrapper[4826]: I0319 19:12:30.017314 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862"} err="failed to get container status \"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862\": rpc error: code = NotFound desc = could not find container \"1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862\": container with ID starting with 1c12390e1edf103197e43825aa34a9e3f97bd897959ae9e98126429c064b8862 not found: ID does not exist"
Mar 19 19:12:30 crc kubenswrapper[4826]: I0319 19:12:30.035698 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"]
Mar 19 19:12:30 crc kubenswrapper[4826]: I0319 19:12:30.036292 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55ff78d854-f4zw8"]
Mar 19 19:12:31 crc kubenswrapper[4826]: I0319 19:12:31.997230 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000caec5-bbed-4ed7-b946-7f859ab61e98" path="/var/lib/kubelet/pods/000caec5-bbed-4ed7-b946-7f859ab61e98/volumes"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.962501 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"]
Mar 19 19:12:33 crc kubenswrapper[4826]: E0319 19:12:33.963053 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000caec5-bbed-4ed7-b946-7f859ab61e98" containerName="console"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.963064 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="000caec5-bbed-4ed7-b946-7f859ab61e98" containerName="console"
Mar 19 19:12:33 crc kubenswrapper[4826]: E0319 19:12:33.963078 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" containerName="oc"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.963084 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" containerName="oc"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.963209 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="000caec5-bbed-4ed7-b946-7f859ab61e98" containerName="console"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.963227 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" containerName="oc"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.964149 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.967645 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 19 19:12:33 crc kubenswrapper[4826]: I0319 19:12:33.990294 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"]
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.082790 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.082863 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.082938 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvsc\" (UniqueName: \"kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.184993 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.185107 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.185208 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvsc\" (UniqueName: \"kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.185503 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.185753 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.208067 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvsc\" (UniqueName: \"kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.281589 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:34 crc kubenswrapper[4826]: I0319 19:12:34.726754 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"]
Mar 19 19:12:35 crc kubenswrapper[4826]: I0319 19:12:35.034124 4826 generic.go:334] "Generic (PLEG): container finished" podID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerID="d5dbdeb3cf244da1ed0391a08389494c393d9217c103e21bb35313bae980ff2a" exitCode=0
Mar 19 19:12:35 crc kubenswrapper[4826]: I0319 19:12:35.034166 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp" event={"ID":"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e","Type":"ContainerDied","Data":"d5dbdeb3cf244da1ed0391a08389494c393d9217c103e21bb35313bae980ff2a"}
Mar 19 19:12:35 crc kubenswrapper[4826]: I0319 19:12:35.034193 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp" event={"ID":"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e","Type":"ContainerStarted","Data":"9ad0a6cb802b97653b70cccdb4d90dd5bb735e242e29bee372bf1a503b02fa99"}
Mar 19 19:12:38 crc kubenswrapper[4826]: I0319 19:12:38.072999 4826 generic.go:334] "Generic (PLEG): container finished" podID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerID="4bd7869992327245413170343bde5d2fb998ccce25b5ffba2a952110e5dc6c7f" exitCode=0
Mar 19 19:12:38 crc kubenswrapper[4826]: I0319 19:12:38.073161 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp" event={"ID":"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e","Type":"ContainerDied","Data":"4bd7869992327245413170343bde5d2fb998ccce25b5ffba2a952110e5dc6c7f"}
Mar 19 19:12:39 crc kubenswrapper[4826]: I0319 19:12:39.083587 4826 generic.go:334] "Generic (PLEG): container finished" podID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerID="467e2426523765e518ec3a083c2bb31d54968b84f679a43d25e9d412a0f6e96a" exitCode=0
Mar 19 19:12:39 crc kubenswrapper[4826]: I0319 19:12:39.083722 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp" event={"ID":"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e","Type":"ContainerDied","Data":"467e2426523765e518ec3a083c2bb31d54968b84f679a43d25e9d412a0f6e96a"}
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.481484 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.602746 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle\") pod \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") "
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.602803 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util\") pod \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") "
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.602899 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvsc\" (UniqueName: \"kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc\") pod \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\" (UID: \"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e\") "
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.604161 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle" (OuterVolumeSpecName: "bundle") pod "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" (UID: "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.613947 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc" (OuterVolumeSpecName: "kube-api-access-5xvsc") pod "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" (UID: "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e"). InnerVolumeSpecName "kube-api-access-5xvsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.621051 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util" (OuterVolumeSpecName: "util") pod "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" (UID: "9ab1a846-5046-4bc2-a83a-a7a1ee360c2e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.705417 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.705494 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-util\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:40 crc kubenswrapper[4826]: I0319 19:12:40.705513 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvsc\" (UniqueName: \"kubernetes.io/projected/9ab1a846-5046-4bc2-a83a-a7a1ee360c2e-kube-api-access-5xvsc\") on node \"crc\" DevicePath \"\""
Mar 19 19:12:41 crc kubenswrapper[4826]: I0319 19:12:41.117440 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp" event={"ID":"9ab1a846-5046-4bc2-a83a-a7a1ee360c2e","Type":"ContainerDied","Data":"9ad0a6cb802b97653b70cccdb4d90dd5bb735e242e29bee372bf1a503b02fa99"}
Mar 19 19:12:41 crc kubenswrapper[4826]: I0319 19:12:41.117498 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad0a6cb802b97653b70cccdb4d90dd5bb735e242e29bee372bf1a503b02fa99"
Mar 19 19:12:41 crc kubenswrapper[4826]: I0319 19:12:41.117535 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.055328 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d47777df-4x998"]
Mar 19 19:12:49 crc kubenswrapper[4826]: E0319 19:12:49.056263 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="extract"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.056278 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="extract"
Mar 19 19:12:49 crc kubenswrapper[4826]: E0319 19:12:49.056305 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="pull"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.056313 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="pull"
Mar 19 19:12:49 crc kubenswrapper[4826]: E0319 19:12:49.056324 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="util"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.056332 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="util"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.056496 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab1a846-5046-4bc2-a83a-a7a1ee360c2e" containerName="extract"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.057185 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.059889 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-v6dkp"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.060150 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.062441 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.062615 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.064090 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d47777df-4x998"]
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.065959 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.140083 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xs2z\" (UniqueName: \"kubernetes.io/projected/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-kube-api-access-5xs2z\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.140417 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-webhook-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.140564 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-apiservice-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.242533 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-apiservice-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.242650 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xs2z\" (UniqueName: \"kubernetes.io/projected/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-kube-api-access-5xs2z\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.242733 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-webhook-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.260598 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-webhook-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.261420 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xs2z\" (UniqueName: \"kubernetes.io/projected/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-kube-api-access-5xs2z\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.261925 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/010ce31f-d333-43a9-b1e0-cd85cc0f6fd6-apiservice-cert\") pod \"metallb-operator-controller-manager-84d47777df-4x998\" (UID: \"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6\") " pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.305130 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"]
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.306062 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.308073 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tz5c9"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.308117 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.308327 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.327095 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"]
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.376005 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.447572 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnr7\" (UniqueName: \"kubernetes.io/projected/b57da585-9fca-48a5-a872-e5019db1e36e-kube-api-access-rrnr7\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"
Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.447632 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-apiservice-cert\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"
Mar 19 19:12:49 crc kubenswrapper[4826]:
I0319 19:12:49.447760 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-webhook-cert\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.552454 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnr7\" (UniqueName: \"kubernetes.io/projected/b57da585-9fca-48a5-a872-e5019db1e36e-kube-api-access-rrnr7\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.552789 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-apiservice-cert\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.552868 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-webhook-cert\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.561703 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-apiservice-cert\") pod 
\"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.583536 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b57da585-9fca-48a5-a872-e5019db1e36e-webhook-cert\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.583722 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnr7\" (UniqueName: \"kubernetes.io/projected/b57da585-9fca-48a5-a872-e5019db1e36e-kube-api-access-rrnr7\") pod \"metallb-operator-webhook-server-8645ff956b-rx86q\" (UID: \"b57da585-9fca-48a5-a872-e5019db1e36e\") " pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.641739 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:49 crc kubenswrapper[4826]: I0319 19:12:49.861902 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84d47777df-4x998"] Mar 19 19:12:49 crc kubenswrapper[4826]: W0319 19:12:49.866882 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010ce31f_d333_43a9_b1e0_cd85cc0f6fd6.slice/crio-707e800a659077caf5115861abd4553ba363c6edf8dadd0d410e0b01e73127c1 WatchSource:0}: Error finding container 707e800a659077caf5115861abd4553ba363c6edf8dadd0d410e0b01e73127c1: Status 404 returned error can't find the container with id 707e800a659077caf5115861abd4553ba363c6edf8dadd0d410e0b01e73127c1 Mar 19 19:12:50 crc kubenswrapper[4826]: I0319 19:12:50.061157 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q"] Mar 19 19:12:50 crc kubenswrapper[4826]: W0319 19:12:50.068334 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb57da585_9fca_48a5_a872_e5019db1e36e.slice/crio-974b4aa0ce9770a1eb4558f73a06f39ea3426665a340229417849d3e232cf3dc WatchSource:0}: Error finding container 974b4aa0ce9770a1eb4558f73a06f39ea3426665a340229417849d3e232cf3dc: Status 404 returned error can't find the container with id 974b4aa0ce9770a1eb4558f73a06f39ea3426665a340229417849d3e232cf3dc Mar 19 19:12:50 crc kubenswrapper[4826]: I0319 19:12:50.200863 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" event={"ID":"b57da585-9fca-48a5-a872-e5019db1e36e","Type":"ContainerStarted","Data":"974b4aa0ce9770a1eb4558f73a06f39ea3426665a340229417849d3e232cf3dc"} Mar 19 19:12:50 crc kubenswrapper[4826]: I0319 19:12:50.202108 4826 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" event={"ID":"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6","Type":"ContainerStarted","Data":"707e800a659077caf5115861abd4553ba363c6edf8dadd0d410e0b01e73127c1"} Mar 19 19:12:54 crc kubenswrapper[4826]: I0319 19:12:54.238803 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" event={"ID":"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6","Type":"ContainerStarted","Data":"b5b421b006d9403cd764de38fd564830fc823289b50ceab2e322440e22638665"} Mar 19 19:12:54 crc kubenswrapper[4826]: I0319 19:12:54.239437 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" Mar 19 19:12:55 crc kubenswrapper[4826]: I0319 19:12:55.401051 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:12:55 crc kubenswrapper[4826]: I0319 19:12:55.401113 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:12:55 crc kubenswrapper[4826]: I0319 19:12:55.401161 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:12:55 crc kubenswrapper[4826]: I0319 19:12:55.401823 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:12:55 crc kubenswrapper[4826]: I0319 19:12:55.401881 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc" gracePeriod=600 Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.000169 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" podStartSLOduration=3.341169973 podStartE2EDuration="7.000150198s" podCreationTimestamp="2026-03-19 19:12:49 +0000 UTC" firstStartedPulling="2026-03-19 19:12:49.869250609 +0000 UTC m=+994.623318922" lastFinishedPulling="2026-03-19 19:12:53.528230834 +0000 UTC m=+998.282299147" observedRunningTime="2026-03-19 19:12:54.26124359 +0000 UTC m=+999.015311943" watchObservedRunningTime="2026-03-19 19:12:56.000150198 +0000 UTC m=+1000.754218511" Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.252529 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" event={"ID":"b57da585-9fca-48a5-a872-e5019db1e36e","Type":"ContainerStarted","Data":"5cfbb83518ed54ebf9988facce22068ceae695e9cf4f38cc1f2065a778e798be"} Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.252609 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.255329 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc" exitCode=0 Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.255372 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc"} Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.255396 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22"} Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.255412 4826 scope.go:117] "RemoveContainer" containerID="8ea88529d854cce8d681f7f626a89455b8191b0562d0f4f8577894657d2eaf5c" Mar 19 19:12:56 crc kubenswrapper[4826]: I0319 19:12:56.274100 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podStartSLOduration=1.757172008 podStartE2EDuration="7.274083079s" podCreationTimestamp="2026-03-19 19:12:49 +0000 UTC" firstStartedPulling="2026-03-19 19:12:50.071792734 +0000 UTC m=+994.825861087" lastFinishedPulling="2026-03-19 19:12:55.588703825 +0000 UTC m=+1000.342772158" observedRunningTime="2026-03-19 19:12:56.26789117 +0000 UTC m=+1001.021959483" watchObservedRunningTime="2026-03-19 19:12:56.274083079 +0000 UTC m=+1001.028151392" Mar 19 19:13:09 crc kubenswrapper[4826]: I0319 19:13:09.652149 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 19:13:29 crc kubenswrapper[4826]: I0319 19:13:29.381122 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.210047 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-prxxj"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.214472 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.218461 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.218621 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.230580 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7zgkx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.230599 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.231668 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.233455 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.249331 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332384 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ks4\" (UniqueName: \"kubernetes.io/projected/b724e39c-45b5-4701-b4f0-a19969224d90-kube-api-access-q4ks4\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332448 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-conf\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332487 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-metrics\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332672 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b724e39c-45b5-4701-b4f0-a19969224d90-frr-startup\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: 
I0319 19:13:30.332696 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b724e39c-45b5-4701-b4f0-a19969224d90-metrics-certs\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332723 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-reloader\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332747 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332822 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-sockets\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.332846 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wtq\" (UniqueName: \"kubernetes.io/projected/81cad5dc-6bd8-4081-adc1-28f65b056636-kube-api-access-w4wtq\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 
19:13:30.343538 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-cnfr9"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.353258 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.356951 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.371141 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w2f68"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.373083 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.377050 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.377478 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.378703 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xctxx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.379740 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cnfr9"] Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.381106 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434054 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ks4\" (UniqueName: \"kubernetes.io/projected/b724e39c-45b5-4701-b4f0-a19969224d90-kube-api-access-q4ks4\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " 
pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434102 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-conf\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434123 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b812f1db-b2c8-467c-977a-a8661540546e-metallb-excludel2\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434153 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434171 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-metrics\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434216 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b724e39c-45b5-4701-b4f0-a19969224d90-frr-startup\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434232 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b724e39c-45b5-4701-b4f0-a19969224d90-metrics-certs\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434255 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-reloader\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434271 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-cert\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434289 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434304 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434331 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2vq7\" (UniqueName: 
\"kubernetes.io/projected/b812f1db-b2c8-467c-977a-a8661540546e-kube-api-access-m2vq7\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434364 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswrq\" (UniqueName: \"kubernetes.io/projected/13651756-55fe-46f1-b849-fbdc5dc20887-kube-api-access-vswrq\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434407 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-metrics-certs\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434425 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-sockets\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.434444 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wtq\" (UniqueName: \"kubernetes.io/projected/81cad5dc-6bd8-4081-adc1-28f65b056636-kube-api-access-w4wtq\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.434779 4826 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 19 19:13:30 crc 
kubenswrapper[4826]: I0319 19:13:30.434904 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-reloader\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.434947 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert podName:81cad5dc-6bd8-4081-adc1-28f65b056636 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:30.934923001 +0000 UTC m=+1035.688991314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert") pod "frr-k8s-webhook-server-bcc4b6f68-6btqx" (UID: "81cad5dc-6bd8-4081-adc1-28f65b056636") : secret "frr-k8s-webhook-server-cert" not found Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.435047 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-sockets\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.435144 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-frr-conf\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.435238 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b724e39c-45b5-4701-b4f0-a19969224d90-metrics\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" 
Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.435938 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b724e39c-45b5-4701-b4f0-a19969224d90-frr-startup\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.441421 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b724e39c-45b5-4701-b4f0-a19969224d90-metrics-certs\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.449918 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wtq\" (UniqueName: \"kubernetes.io/projected/81cad5dc-6bd8-4081-adc1-28f65b056636-kube-api-access-w4wtq\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.453397 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ks4\" (UniqueName: \"kubernetes.io/projected/b724e39c-45b5-4701-b4f0-a19969224d90-kube-api-access-q4ks4\") pod \"frr-k8s-prxxj\" (UID: \"b724e39c-45b5-4701-b4f0-a19969224d90\") " pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.535558 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536195 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-cert\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536251 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536305 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vq7\" (UniqueName: \"kubernetes.io/projected/b812f1db-b2c8-467c-977a-a8661540546e-kube-api-access-m2vq7\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536352 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswrq\" (UniqueName: \"kubernetes.io/projected/13651756-55fe-46f1-b849-fbdc5dc20887-kube-api-access-vswrq\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536411 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-metrics-certs\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536481 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b812f1db-b2c8-467c-977a-a8661540546e-metallb-excludel2\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.536516 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.536698 4826 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.536759 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs podName:13651756-55fe-46f1-b849-fbdc5dc20887 nodeName:}" failed. No retries permitted until 2026-03-19 19:13:31.036739882 +0000 UTC m=+1035.790808195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs") pod "controller-7bb4cc7c98-cnfr9" (UID: "13651756-55fe-46f1-b849-fbdc5dc20887") : secret "controller-certs-secret" not found Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.536864 4826 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 19:13:30 crc kubenswrapper[4826]: E0319 19:13:30.536922 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist podName:b812f1db-b2c8-467c-977a-a8661540546e nodeName:}" failed. 
No retries permitted until 2026-03-19 19:13:31.036901556 +0000 UTC m=+1035.790969949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist") pod "speaker-w2f68" (UID: "b812f1db-b2c8-467c-977a-a8661540546e") : secret "metallb-memberlist" not found Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.537711 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b812f1db-b2c8-467c-977a-a8661540546e-metallb-excludel2\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.537832 4826 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.540479 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-metrics-certs\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.556276 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswrq\" (UniqueName: \"kubernetes.io/projected/13651756-55fe-46f1-b849-fbdc5dc20887-kube-api-access-vswrq\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.556275 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vq7\" (UniqueName: \"kubernetes.io/projected/b812f1db-b2c8-467c-977a-a8661540546e-kube-api-access-m2vq7\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " 
pod="metallb-system/speaker-w2f68" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.558465 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-cert\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.943620 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:30 crc kubenswrapper[4826]: I0319 19:13:30.949339 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81cad5dc-6bd8-4081-adc1-28f65b056636-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6btqx\" (UID: \"81cad5dc-6bd8-4081-adc1-28f65b056636\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.046080 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.046272 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:31 crc kubenswrapper[4826]: E0319 19:13:31.046775 4826 secret.go:188] 
Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 19:13:31 crc kubenswrapper[4826]: E0319 19:13:31.046894 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist podName:b812f1db-b2c8-467c-977a-a8661540546e nodeName:}" failed. No retries permitted until 2026-03-19 19:13:32.046864131 +0000 UTC m=+1036.800932474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist") pod "speaker-w2f68" (UID: "b812f1db-b2c8-467c-977a-a8661540546e") : secret "metallb-memberlist" not found Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.049825 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13651756-55fe-46f1-b849-fbdc5dc20887-metrics-certs\") pod \"controller-7bb4cc7c98-cnfr9\" (UID: \"13651756-55fe-46f1-b849-fbdc5dc20887\") " pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.156117 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.287318 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.584056 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"aa0b6fa2210bf390374e4954a8c007aafdf32fc16cf3e91d30560251b8791a31"} Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.642474 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx"] Mar 19 19:13:31 crc kubenswrapper[4826]: I0319 19:13:31.738119 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cnfr9"] Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.067104 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.077154 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b812f1db-b2c8-467c-977a-a8661540546e-memberlist\") pod \"speaker-w2f68\" (UID: \"b812f1db-b2c8-467c-977a-a8661540546e\") " pod="metallb-system/speaker-w2f68" Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.192542 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w2f68" Mar 19 19:13:32 crc kubenswrapper[4826]: W0319 19:13:32.222967 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb812f1db_b2c8_467c_977a_a8661540546e.slice/crio-20fd8e6d25ef7b7b7fa6b7eb40e4760a5a2ddf555bbbdb0b93655895142a0730 WatchSource:0}: Error finding container 20fd8e6d25ef7b7b7fa6b7eb40e4760a5a2ddf555bbbdb0b93655895142a0730: Status 404 returned error can't find the container with id 20fd8e6d25ef7b7b7fa6b7eb40e4760a5a2ddf555bbbdb0b93655895142a0730 Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.592587 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w2f68" event={"ID":"b812f1db-b2c8-467c-977a-a8661540546e","Type":"ContainerStarted","Data":"cecf9f716377394271759e79568a9bf3e3c6352f09e209ea04029a11db3f6fc7"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.592668 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w2f68" event={"ID":"b812f1db-b2c8-467c-977a-a8661540546e","Type":"ContainerStarted","Data":"20fd8e6d25ef7b7b7fa6b7eb40e4760a5a2ddf555bbbdb0b93655895142a0730"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.594987 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cnfr9" event={"ID":"13651756-55fe-46f1-b849-fbdc5dc20887","Type":"ContainerStarted","Data":"b818e7cdc7da8bbfb5ec5f0957d5ec815cf85b7d9998f5c8067f76eff1f90d30"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.595053 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cnfr9" event={"ID":"13651756-55fe-46f1-b849-fbdc5dc20887","Type":"ContainerStarted","Data":"9d1db1e186f4dd843e2e7246861156296c2f0851424bf285edc9f4625d32c0e2"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.595069 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-7bb4cc7c98-cnfr9" event={"ID":"13651756-55fe-46f1-b849-fbdc5dc20887","Type":"ContainerStarted","Data":"79e23bc832302cb5731b4ad5e93c67f15921cf3445358e45797acfc56e01d55d"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.596983 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.598441 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" event={"ID":"81cad5dc-6bd8-4081-adc1-28f65b056636","Type":"ContainerStarted","Data":"1bae5efce611aa863c27e07a3204e005de74c0d7d7e4ba9d73fcc6e6f9f7d521"} Mar 19 19:13:32 crc kubenswrapper[4826]: I0319 19:13:32.633833 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-cnfr9" podStartSLOduration=2.633813437 podStartE2EDuration="2.633813437s" podCreationTimestamp="2026-03-19 19:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:13:32.628897488 +0000 UTC m=+1037.382965831" watchObservedRunningTime="2026-03-19 19:13:32.633813437 +0000 UTC m=+1037.387881760" Mar 19 19:13:33 crc kubenswrapper[4826]: I0319 19:13:33.610311 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w2f68" event={"ID":"b812f1db-b2c8-467c-977a-a8661540546e","Type":"ContainerStarted","Data":"8cbba9338d775803d523af5f8bf5339d9689faa1f03357a98834f4f9658c545d"} Mar 19 19:13:33 crc kubenswrapper[4826]: I0319 19:13:33.610419 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w2f68" Mar 19 19:13:33 crc kubenswrapper[4826]: I0319 19:13:33.630883 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w2f68" podStartSLOduration=3.630867395 podStartE2EDuration="3.630867395s" 
podCreationTimestamp="2026-03-19 19:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:13:33.627790201 +0000 UTC m=+1038.381858514" watchObservedRunningTime="2026-03-19 19:13:33.630867395 +0000 UTC m=+1038.384935698" Mar 19 19:13:39 crc kubenswrapper[4826]: I0319 19:13:39.660639 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" event={"ID":"81cad5dc-6bd8-4081-adc1-28f65b056636","Type":"ContainerStarted","Data":"41fa4ca7bfde9c58a52e5d57a2358483ce84ae4dac66f1e5cadf2397e21f9fbf"} Mar 19 19:13:39 crc kubenswrapper[4826]: I0319 19:13:39.663365 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:39 crc kubenswrapper[4826]: I0319 19:13:39.666885 4826 generic.go:334] "Generic (PLEG): container finished" podID="b724e39c-45b5-4701-b4f0-a19969224d90" containerID="07f5780e22fc8efa15331dfe71931e8fcfcfdc6c5b0ac928cc973068abecc0b9" exitCode=0 Mar 19 19:13:39 crc kubenswrapper[4826]: I0319 19:13:39.666971 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerDied","Data":"07f5780e22fc8efa15331dfe71931e8fcfcfdc6c5b0ac928cc973068abecc0b9"} Mar 19 19:13:39 crc kubenswrapper[4826]: I0319 19:13:39.684974 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podStartSLOduration=2.800600608 podStartE2EDuration="9.684952188s" podCreationTimestamp="2026-03-19 19:13:30 +0000 UTC" firstStartedPulling="2026-03-19 19:13:31.656252009 +0000 UTC m=+1036.410320332" lastFinishedPulling="2026-03-19 19:13:38.540603599 +0000 UTC m=+1043.294671912" observedRunningTime="2026-03-19 19:13:39.684912967 +0000 UTC m=+1044.438981320" watchObservedRunningTime="2026-03-19 
19:13:39.684952188 +0000 UTC m=+1044.439020531" Mar 19 19:13:40 crc kubenswrapper[4826]: I0319 19:13:40.675050 4826 generic.go:334] "Generic (PLEG): container finished" podID="b724e39c-45b5-4701-b4f0-a19969224d90" containerID="e628f49d54daeec9e5663008a74b8987446a8ae37d081a99e4bf8985553d5446" exitCode=0 Mar 19 19:13:40 crc kubenswrapper[4826]: I0319 19:13:40.675110 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerDied","Data":"e628f49d54daeec9e5663008a74b8987446a8ae37d081a99e4bf8985553d5446"} Mar 19 19:13:41 crc kubenswrapper[4826]: I0319 19:13:41.293139 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-cnfr9" Mar 19 19:13:41 crc kubenswrapper[4826]: I0319 19:13:41.688843 4826 generic.go:334] "Generic (PLEG): container finished" podID="b724e39c-45b5-4701-b4f0-a19969224d90" containerID="296d8c7df0a5430a1ef31301e81caa77799eee7e2ce9de47f7f058a33ad6a62f" exitCode=0 Mar 19 19:13:41 crc kubenswrapper[4826]: I0319 19:13:41.688916 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerDied","Data":"296d8c7df0a5430a1ef31301e81caa77799eee7e2ce9de47f7f058a33ad6a62f"} Mar 19 19:13:42 crc kubenswrapper[4826]: I0319 19:13:42.200396 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w2f68" Mar 19 19:13:42 crc kubenswrapper[4826]: I0319 19:13:42.703757 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"b5d5c87b49f6f2bf71d99c2b111b2dfd8718baa7681cf2a84f711e42ed743d7e"} Mar 19 19:13:42 crc kubenswrapper[4826]: I0319 19:13:42.704339 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" 
event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"9d6ba94311eda6306a3392e29ffc629cb7e01f99ea2201e9830e8e1ab10d2f0b"} Mar 19 19:13:42 crc kubenswrapper[4826]: I0319 19:13:42.704357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"d6501efe4d21439968204859b1ed1ca17f8790cc86545382812dea2434ac9b1b"} Mar 19 19:13:42 crc kubenswrapper[4826]: I0319 19:13:42.704371 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"04a2f678f0acbe9cdead3d329e1d3aa6d7f40e03a8448d5eb57c7065bb9d062e"} Mar 19 19:13:43 crc kubenswrapper[4826]: I0319 19:13:43.728693 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"f5a9926af843c94bec0c9b3bbc185c40fdad87b0807937d5841d94b89e92b3a0"} Mar 19 19:13:43 crc kubenswrapper[4826]: I0319 19:13:43.729042 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"ca399e9102a8ffdb9de5613948b702eda4b8d8c96b642bb891f4a7d310c51884"} Mar 19 19:13:43 crc kubenswrapper[4826]: I0319 19:13:43.729097 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:43 crc kubenswrapper[4826]: I0319 19:13:43.757070 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-prxxj" podStartSLOduration=5.935115297 podStartE2EDuration="13.757053907s" podCreationTimestamp="2026-03-19 19:13:30 +0000 UTC" firstStartedPulling="2026-03-19 19:13:30.694950766 +0000 UTC m=+1035.449019079" lastFinishedPulling="2026-03-19 19:13:38.516889336 +0000 UTC m=+1043.270957689" 
observedRunningTime="2026-03-19 19:13:43.750131369 +0000 UTC m=+1048.504199752" watchObservedRunningTime="2026-03-19 19:13:43.757053907 +0000 UTC m=+1048.511122220" Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.916461 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.917581 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.920052 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.920563 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-t2t6k" Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.922615 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 19:13:44 crc kubenswrapper[4826]: I0319 19:13:44.926915 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.012384 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmvw\" (UniqueName: \"kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw\") pod \"openstack-operator-index-khftc\" (UID: \"b96103dd-d73c-4179-8696-2c804081f69f\") " pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.114255 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmvw\" (UniqueName: \"kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw\") pod 
\"openstack-operator-index-khftc\" (UID: \"b96103dd-d73c-4179-8696-2c804081f69f\") " pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.143771 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmvw\" (UniqueName: \"kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw\") pod \"openstack-operator-index-khftc\" (UID: \"b96103dd-d73c-4179-8696-2c804081f69f\") " pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.234387 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.536859 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.576692 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.714793 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:45 crc kubenswrapper[4826]: W0319 19:13:45.715585 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96103dd_d73c_4179_8696_2c804081f69f.slice/crio-f1ffd031e26d9119d07d7e6411067d1269c75bcb9e6e96a2cda779c601a98445 WatchSource:0}: Error finding container f1ffd031e26d9119d07d7e6411067d1269c75bcb9e6e96a2cda779c601a98445: Status 404 returned error can't find the container with id f1ffd031e26d9119d07d7e6411067d1269c75bcb9e6e96a2cda779c601a98445 Mar 19 19:13:45 crc kubenswrapper[4826]: I0319 19:13:45.753130 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-khftc" 
event={"ID":"b96103dd-d73c-4179-8696-2c804081f69f","Type":"ContainerStarted","Data":"f1ffd031e26d9119d07d7e6411067d1269c75bcb9e6e96a2cda779c601a98445"} Mar 19 19:13:47 crc kubenswrapper[4826]: I0319 19:13:47.897954 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.503971 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wrwzn"] Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.506731 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.514713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wrwzn"] Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.684228 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwhv\" (UniqueName: \"kubernetes.io/projected/f9f3d33c-0421-473c-94e6-a7860932d772-kube-api-access-nxwhv\") pod \"openstack-operator-index-wrwzn\" (UID: \"f9f3d33c-0421-473c-94e6-a7860932d772\") " pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.784059 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-khftc" event={"ID":"b96103dd-d73c-4179-8696-2c804081f69f","Type":"ContainerStarted","Data":"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536"} Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.784249 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-khftc" podUID="b96103dd-d73c-4179-8696-2c804081f69f" containerName="registry-server" containerID="cri-o://5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536" 
gracePeriod=2 Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.785695 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwhv\" (UniqueName: \"kubernetes.io/projected/f9f3d33c-0421-473c-94e6-a7860932d772-kube-api-access-nxwhv\") pod \"openstack-operator-index-wrwzn\" (UID: \"f9f3d33c-0421-473c-94e6-a7860932d772\") " pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.822073 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwhv\" (UniqueName: \"kubernetes.io/projected/f9f3d33c-0421-473c-94e6-a7860932d772-kube-api-access-nxwhv\") pod \"openstack-operator-index-wrwzn\" (UID: \"f9f3d33c-0421-473c-94e6-a7860932d772\") " pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:48 crc kubenswrapper[4826]: I0319 19:13:48.857866 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.357757 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.461681 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wrwzn"] Mar 19 19:13:49 crc kubenswrapper[4826]: W0319 19:13:49.466100 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f3d33c_0421_473c_94e6_a7860932d772.slice/crio-fa26744c0a6e6a4f8a848dac3e87ac599646826062b542d1d508314237092e82 WatchSource:0}: Error finding container fa26744c0a6e6a4f8a848dac3e87ac599646826062b542d1d508314237092e82: Status 404 returned error can't find the container with id fa26744c0a6e6a4f8a848dac3e87ac599646826062b542d1d508314237092e82 Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.473239 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.500739 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmvw\" (UniqueName: \"kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw\") pod \"b96103dd-d73c-4179-8696-2c804081f69f\" (UID: \"b96103dd-d73c-4179-8696-2c804081f69f\") " Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.506186 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw" (OuterVolumeSpecName: "kube-api-access-vhmvw") pod "b96103dd-d73c-4179-8696-2c804081f69f" (UID: "b96103dd-d73c-4179-8696-2c804081f69f"). InnerVolumeSpecName "kube-api-access-vhmvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.602990 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmvw\" (UniqueName: \"kubernetes.io/projected/b96103dd-d73c-4179-8696-2c804081f69f-kube-api-access-vhmvw\") on node \"crc\" DevicePath \"\"" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.796709 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wrwzn" event={"ID":"f9f3d33c-0421-473c-94e6-a7860932d772","Type":"ContainerStarted","Data":"9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f"} Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.797321 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wrwzn" event={"ID":"f9f3d33c-0421-473c-94e6-a7860932d772","Type":"ContainerStarted","Data":"fa26744c0a6e6a4f8a848dac3e87ac599646826062b542d1d508314237092e82"} Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.801135 4826 generic.go:334] "Generic (PLEG): container finished" podID="b96103dd-d73c-4179-8696-2c804081f69f" containerID="5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536" exitCode=0 Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.801191 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-khftc" event={"ID":"b96103dd-d73c-4179-8696-2c804081f69f","Type":"ContainerDied","Data":"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536"} Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.801221 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-khftc" event={"ID":"b96103dd-d73c-4179-8696-2c804081f69f","Type":"ContainerDied","Data":"f1ffd031e26d9119d07d7e6411067d1269c75bcb9e6e96a2cda779c601a98445"} Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.801235 4826 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-khftc" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.801247 4826 scope.go:117] "RemoveContainer" containerID="5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.841856 4826 scope.go:117] "RemoveContainer" containerID="5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536" Mar 19 19:13:49 crc kubenswrapper[4826]: E0319 19:13:49.842700 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536\": container with ID starting with 5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536 not found: ID does not exist" containerID="5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.842764 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536"} err="failed to get container status \"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536\": rpc error: code = NotFound desc = could not find container \"5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536\": container with ID starting with 5de51280a8b8a679445d6a7bc53d198cfe7c9931bbb0f0116f0d050a8fffd536 not found: ID does not exist" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.845239 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wrwzn" podStartSLOduration=1.793579115 podStartE2EDuration="1.845222573s" podCreationTimestamp="2026-03-19 19:13:48 +0000 UTC" firstStartedPulling="2026-03-19 19:13:49.472967636 +0000 UTC m=+1054.227035959" lastFinishedPulling="2026-03-19 19:13:49.524611094 +0000 UTC m=+1054.278679417" 
observedRunningTime="2026-03-19 19:13:49.822882453 +0000 UTC m=+1054.576950806" watchObservedRunningTime="2026-03-19 19:13:49.845222573 +0000 UTC m=+1054.599290896" Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.870404 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.879106 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-khftc"] Mar 19 19:13:49 crc kubenswrapper[4826]: I0319 19:13:49.990015 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96103dd-d73c-4179-8696-2c804081f69f" path="/var/lib/kubelet/pods/b96103dd-d73c-4179-8696-2c804081f69f/volumes" Mar 19 19:13:51 crc kubenswrapper[4826]: I0319 19:13:51.164859 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 19:13:58 crc kubenswrapper[4826]: I0319 19:13:58.861034 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:58 crc kubenswrapper[4826]: I0319 19:13:58.861782 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:58 crc kubenswrapper[4826]: I0319 19:13:58.901217 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:13:58 crc kubenswrapper[4826]: I0319 19:13:58.937076 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.149227 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565794-6r8f4"] Mar 19 19:14:00 crc kubenswrapper[4826]: E0319 19:14:00.150981 4826 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b96103dd-d73c-4179-8696-2c804081f69f" containerName="registry-server" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.151085 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96103dd-d73c-4179-8696-2c804081f69f" containerName="registry-server" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.151356 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96103dd-d73c-4179-8696-2c804081f69f" containerName="registry-server" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.152097 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.155823 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.156307 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.157721 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.172083 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-6r8f4"] Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.214057 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ttx\" (UniqueName: \"kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx\") pod \"auto-csr-approver-29565794-6r8f4\" (UID: \"6c439c10-be6f-469b-a52e-1a77f9afc7be\") " pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.315924 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p9ttx\" (UniqueName: \"kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx\") pod \"auto-csr-approver-29565794-6r8f4\" (UID: \"6c439c10-be6f-469b-a52e-1a77f9afc7be\") " pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.337989 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ttx\" (UniqueName: \"kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx\") pod \"auto-csr-approver-29565794-6r8f4\" (UID: \"6c439c10-be6f-469b-a52e-1a77f9afc7be\") " pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.477488 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.556115 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-prxxj" Mar 19 19:14:00 crc kubenswrapper[4826]: I0319 19:14:00.959754 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-6r8f4"] Mar 19 19:14:01 crc kubenswrapper[4826]: I0319 19:14:01.927202 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" event={"ID":"6c439c10-be6f-469b-a52e-1a77f9afc7be","Type":"ContainerStarted","Data":"5370848f3e5595b38c7391c13fe3d0f79a6dd2058ba20e12599f25d9dfc3c669"} Mar 19 19:14:03 crc kubenswrapper[4826]: I0319 19:14:03.948432 4826 generic.go:334] "Generic (PLEG): container finished" podID="6c439c10-be6f-469b-a52e-1a77f9afc7be" containerID="36c100cd6943dc6bbc62bf14ad99171aedef87ef8190d7f788ed589a8325a9f0" exitCode=0 Mar 19 19:14:03 crc kubenswrapper[4826]: I0319 19:14:03.948517 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" 
event={"ID":"6c439c10-be6f-469b-a52e-1a77f9afc7be","Type":"ContainerDied","Data":"36c100cd6943dc6bbc62bf14ad99171aedef87ef8190d7f788ed589a8325a9f0"} Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.269102 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.407739 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9ttx\" (UniqueName: \"kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx\") pod \"6c439c10-be6f-469b-a52e-1a77f9afc7be\" (UID: \"6c439c10-be6f-469b-a52e-1a77f9afc7be\") " Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.427821 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx" (OuterVolumeSpecName: "kube-api-access-p9ttx") pod "6c439c10-be6f-469b-a52e-1a77f9afc7be" (UID: "6c439c10-be6f-469b-a52e-1a77f9afc7be"). InnerVolumeSpecName "kube-api-access-p9ttx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.510535 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9ttx\" (UniqueName: \"kubernetes.io/projected/6c439c10-be6f-469b-a52e-1a77f9afc7be-kube-api-access-p9ttx\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.964613 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" event={"ID":"6c439c10-be6f-469b-a52e-1a77f9afc7be","Type":"ContainerDied","Data":"5370848f3e5595b38c7391c13fe3d0f79a6dd2058ba20e12599f25d9dfc3c669"} Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.964666 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5370848f3e5595b38c7391c13fe3d0f79a6dd2058ba20e12599f25d9dfc3c669" Mar 19 19:14:05 crc kubenswrapper[4826]: I0319 19:14:05.964669 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565794-6r8f4" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.353507 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-qhtlk"] Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.368743 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565788-qhtlk"] Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.946049 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts"] Mar 19 19:14:06 crc kubenswrapper[4826]: E0319 19:14:06.946689 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c439c10-be6f-469b-a52e-1a77f9afc7be" containerName="oc" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.946708 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c439c10-be6f-469b-a52e-1a77f9afc7be" 
containerName="oc" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.946983 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c439c10-be6f-469b-a52e-1a77f9afc7be" containerName="oc" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.948410 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.951401 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nkl6c" Mar 19 19:14:06 crc kubenswrapper[4826]: I0319 19:14:06.954111 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts"] Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.041266 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.041345 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzcm\" (UniqueName: \"kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.041441 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.143266 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.143315 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzcm\" (UniqueName: \"kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.143382 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.143941 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: 
\"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.143955 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.162705 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzcm\" (UniqueName: \"kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm\") pod \"05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.277225 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.561744 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts"] Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.987519 4826 generic.go:334] "Generic (PLEG): container finished" podID="6898d200-6ab5-4f88-8390-712724dbeb63" containerID="16bb3e684b533de1828a47dd67fba4049b20e7276e81e714a10f7a83092311a1" exitCode=0 Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.988355 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b7e1a6-658b-4f9d-a9ee-947a6283266f" path="/var/lib/kubelet/pods/22b7e1a6-658b-4f9d-a9ee-947a6283266f/volumes" Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.990186 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" event={"ID":"6898d200-6ab5-4f88-8390-712724dbeb63","Type":"ContainerDied","Data":"16bb3e684b533de1828a47dd67fba4049b20e7276e81e714a10f7a83092311a1"} Mar 19 19:14:07 crc kubenswrapper[4826]: I0319 19:14:07.990405 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" event={"ID":"6898d200-6ab5-4f88-8390-712724dbeb63","Type":"ContainerStarted","Data":"ecaed1568e9f8b00e5a0ee8dcbf716409d9241af099a2cafefd155ea0cea19c5"} Mar 19 19:14:08 crc kubenswrapper[4826]: I0319 19:14:08.995787 4826 generic.go:334] "Generic (PLEG): container finished" podID="6898d200-6ab5-4f88-8390-712724dbeb63" containerID="5f3ebb6e9a24d5493208e535b2236467649f1d5eb42d96412b6d2cb2b86bbc4a" exitCode=0 Mar 19 19:14:08 crc kubenswrapper[4826]: I0319 19:14:08.995827 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" event={"ID":"6898d200-6ab5-4f88-8390-712724dbeb63","Type":"ContainerDied","Data":"5f3ebb6e9a24d5493208e535b2236467649f1d5eb42d96412b6d2cb2b86bbc4a"} Mar 19 19:14:10 crc kubenswrapper[4826]: I0319 19:14:10.021186 4826 generic.go:334] "Generic (PLEG): container finished" podID="6898d200-6ab5-4f88-8390-712724dbeb63" containerID="c8f4307921555b00a2232eaee41150f9c3350f7c0d4f9acf055d135a741b545e" exitCode=0 Mar 19 19:14:10 crc kubenswrapper[4826]: I0319 19:14:10.021234 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" event={"ID":"6898d200-6ab5-4f88-8390-712724dbeb63","Type":"ContainerDied","Data":"c8f4307921555b00a2232eaee41150f9c3350f7c0d4f9acf055d135a741b545e"} Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.392293 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.526723 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkzcm\" (UniqueName: \"kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm\") pod \"6898d200-6ab5-4f88-8390-712724dbeb63\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.527059 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util\") pod \"6898d200-6ab5-4f88-8390-712724dbeb63\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.527201 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle\") pod \"6898d200-6ab5-4f88-8390-712724dbeb63\" (UID: \"6898d200-6ab5-4f88-8390-712724dbeb63\") " Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.527718 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle" (OuterVolumeSpecName: "bundle") pod "6898d200-6ab5-4f88-8390-712724dbeb63" (UID: "6898d200-6ab5-4f88-8390-712724dbeb63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.531581 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm" (OuterVolumeSpecName: "kube-api-access-hkzcm") pod "6898d200-6ab5-4f88-8390-712724dbeb63" (UID: "6898d200-6ab5-4f88-8390-712724dbeb63"). InnerVolumeSpecName "kube-api-access-hkzcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.544948 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util" (OuterVolumeSpecName: "util") pod "6898d200-6ab5-4f88-8390-712724dbeb63" (UID: "6898d200-6ab5-4f88-8390-712724dbeb63"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.629127 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkzcm\" (UniqueName: \"kubernetes.io/projected/6898d200-6ab5-4f88-8390-712724dbeb63-kube-api-access-hkzcm\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.629172 4826 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-util\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:11 crc kubenswrapper[4826]: I0319 19:14:11.629186 4826 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6898d200-6ab5-4f88-8390-712724dbeb63-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:14:12 crc kubenswrapper[4826]: I0319 19:14:12.038430 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" event={"ID":"6898d200-6ab5-4f88-8390-712724dbeb63","Type":"ContainerDied","Data":"ecaed1568e9f8b00e5a0ee8dcbf716409d9241af099a2cafefd155ea0cea19c5"} Mar 19 19:14:12 crc kubenswrapper[4826]: I0319 19:14:12.038466 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecaed1568e9f8b00e5a0ee8dcbf716409d9241af099a2cafefd155ea0cea19c5" Mar 19 19:14:12 crc kubenswrapper[4826]: I0319 19:14:12.038524 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.829642 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt"] Mar 19 19:14:14 crc kubenswrapper[4826]: E0319 19:14:14.830356 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="util" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.830373 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="util" Mar 19 19:14:14 crc kubenswrapper[4826]: E0319 19:14:14.830398 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="pull" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.830407 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="pull" Mar 19 19:14:14 crc kubenswrapper[4826]: E0319 19:14:14.830448 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="extract" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.830456 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="extract" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.830640 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6898d200-6ab5-4f88-8390-712724dbeb63" containerName="extract" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.831337 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.834697 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-t82jf" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.891490 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqqwb\" (UniqueName: \"kubernetes.io/projected/c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1-kube-api-access-zqqwb\") pod \"openstack-operator-controller-init-6c6f68556d-k5tlt\" (UID: \"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.900024 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt"] Mar 19 19:14:14 crc kubenswrapper[4826]: I0319 19:14:14.993852 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqqwb\" (UniqueName: \"kubernetes.io/projected/c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1-kube-api-access-zqqwb\") pod \"openstack-operator-controller-init-6c6f68556d-k5tlt\" (UID: \"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:15 crc kubenswrapper[4826]: I0319 19:14:15.014896 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqqwb\" (UniqueName: \"kubernetes.io/projected/c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1-kube-api-access-zqqwb\") pod \"openstack-operator-controller-init-6c6f68556d-k5tlt\" (UID: \"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1\") " pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:15 crc kubenswrapper[4826]: I0319 19:14:15.153604 4826 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:15 crc kubenswrapper[4826]: I0319 19:14:15.670199 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt"] Mar 19 19:14:16 crc kubenswrapper[4826]: I0319 19:14:16.077626 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" event={"ID":"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1","Type":"ContainerStarted","Data":"5a3781780bdb6c0dd626c61cbe52ff9c99881759952817b3fd87e7a485711a3a"} Mar 19 19:14:17 crc kubenswrapper[4826]: I0319 19:14:17.110211 4826 scope.go:117] "RemoveContainer" containerID="253ccce7086c613fe8e20ecca1aacd8e92ef8fb49f779399af2e442d661625c0" Mar 19 19:14:21 crc kubenswrapper[4826]: I0319 19:14:21.129748 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" event={"ID":"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1","Type":"ContainerStarted","Data":"8b365c4de811f05a97f6948925e513561fcd22296731daa07c22f7bbd7cfd7cc"} Mar 19 19:14:21 crc kubenswrapper[4826]: I0319 19:14:21.130299 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:21 crc kubenswrapper[4826]: I0319 19:14:21.185393 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podStartSLOduration=3.103349183 podStartE2EDuration="7.185373592s" podCreationTimestamp="2026-03-19 19:14:14 +0000 UTC" firstStartedPulling="2026-03-19 19:14:15.671826543 +0000 UTC m=+1080.425894856" lastFinishedPulling="2026-03-19 19:14:19.753850912 +0000 UTC m=+1084.507919265" observedRunningTime="2026-03-19 19:14:21.18117763 +0000 UTC m=+1085.935246063" 
watchObservedRunningTime="2026-03-19 19:14:21.185373592 +0000 UTC m=+1085.939441915" Mar 19 19:14:25 crc kubenswrapper[4826]: I0319 19:14:25.158624 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.604511 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.605956 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.607410 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-pnj7r" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.610077 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.611109 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.612329 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9p24t" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.616827 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.623187 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.629754 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs9d5\" (UniqueName: \"kubernetes.io/projected/5f60643c-c919-436b-bd23-9e39698d9c9b-kube-api-access-qs9d5\") pod \"barbican-operator-controller-manager-59bc569d95-jjqrs\" (UID: \"5f60643c-c919-436b-bd23-9e39698d9c9b\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.644339 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.645664 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.650083 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xpvgc" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.650240 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.651273 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.664700 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.665744 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.678358 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-drvfb" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.679434 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.680262 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d4wxj" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.688340 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.701850 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.702869 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.705053 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jfbh9" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731551 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkvsl\" (UniqueName: \"kubernetes.io/projected/ee5c97c9-5dc0-4292-9a34-08ca45f5387a-kube-api-access-hkvsl\") pod \"horizon-operator-controller-manager-8464cc45fb-ngb9j\" (UID: \"ee5c97c9-5dc0-4292-9a34-08ca45f5387a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731625 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hnl\" (UniqueName: \"kubernetes.io/projected/d2375678-e630-4376-9dfd-28efbc77aed4-kube-api-access-s5hnl\") pod \"heat-operator-controller-manager-67dd5f86f5-rsrjx\" (UID: \"d2375678-e630-4376-9dfd-28efbc77aed4\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731691 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltv7\" (UniqueName: \"kubernetes.io/projected/0f77f094-1b90-43a6-85be-27e8b1fda71f-kube-api-access-hltv7\") pod \"glance-operator-controller-manager-79df6bcc97-8265b\" (UID: \"0f77f094-1b90-43a6-85be-27e8b1fda71f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731760 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfrg\" (UniqueName: 
\"kubernetes.io/projected/080fa697-4720-424e-b75e-6564061cd68f-kube-api-access-9bfrg\") pod \"designate-operator-controller-manager-588d4d986b-hf8n5\" (UID: \"080fa697-4720-424e-b75e-6564061cd68f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731789 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvx2\" (UniqueName: \"kubernetes.io/projected/38267b94-39ea-4067-9b6e-3d863ff60494-kube-api-access-7xvx2\") pod \"cinder-operator-controller-manager-8d58dc466-zm4ps\" (UID: \"38267b94-39ea-4067-9b6e-3d863ff60494\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.731851 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs9d5\" (UniqueName: \"kubernetes.io/projected/5f60643c-c919-436b-bd23-9e39698d9c9b-kube-api-access-qs9d5\") pod \"barbican-operator-controller-manager-59bc569d95-jjqrs\" (UID: \"5f60643c-c919-436b-bd23-9e39698d9c9b\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.744741 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.751723 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.761840 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs9d5\" (UniqueName: \"kubernetes.io/projected/5f60643c-c919-436b-bd23-9e39698d9c9b-kube-api-access-qs9d5\") pod \"barbican-operator-controller-manager-59bc569d95-jjqrs\" (UID: \"5f60643c-c919-436b-bd23-9e39698d9c9b\") " 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834241 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkvsl\" (UniqueName: \"kubernetes.io/projected/ee5c97c9-5dc0-4292-9a34-08ca45f5387a-kube-api-access-hkvsl\") pod \"horizon-operator-controller-manager-8464cc45fb-ngb9j\" (UID: \"ee5c97c9-5dc0-4292-9a34-08ca45f5387a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834318 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hnl\" (UniqueName: \"kubernetes.io/projected/d2375678-e630-4376-9dfd-28efbc77aed4-kube-api-access-s5hnl\") pod \"heat-operator-controller-manager-67dd5f86f5-rsrjx\" (UID: \"d2375678-e630-4376-9dfd-28efbc77aed4\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834366 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltv7\" (UniqueName: \"kubernetes.io/projected/0f77f094-1b90-43a6-85be-27e8b1fda71f-kube-api-access-hltv7\") pod \"glance-operator-controller-manager-79df6bcc97-8265b\" (UID: \"0f77f094-1b90-43a6-85be-27e8b1fda71f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834407 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfrg\" (UniqueName: \"kubernetes.io/projected/080fa697-4720-424e-b75e-6564061cd68f-kube-api-access-9bfrg\") pod \"designate-operator-controller-manager-588d4d986b-hf8n5\" (UID: \"080fa697-4720-424e-b75e-6564061cd68f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834419 4826 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.834440 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvx2\" (UniqueName: \"kubernetes.io/projected/38267b94-39ea-4067-9b6e-3d863ff60494-kube-api-access-7xvx2\") pod \"cinder-operator-controller-manager-8d58dc466-zm4ps\" (UID: \"38267b94-39ea-4067-9b6e-3d863ff60494\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.843531 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.850008 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.877393 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkvsl\" (UniqueName: \"kubernetes.io/projected/ee5c97c9-5dc0-4292-9a34-08ca45f5387a-kube-api-access-hkvsl\") pod \"horizon-operator-controller-manager-8464cc45fb-ngb9j\" (UID: \"ee5c97c9-5dc0-4292-9a34-08ca45f5387a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.909114 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.909307 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r8vxn" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.910291 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xvx2\" 
(UniqueName: \"kubernetes.io/projected/38267b94-39ea-4067-9b6e-3d863ff60494-kube-api-access-7xvx2\") pod \"cinder-operator-controller-manager-8d58dc466-zm4ps\" (UID: \"38267b94-39ea-4067-9b6e-3d863ff60494\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.910641 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfrg\" (UniqueName: \"kubernetes.io/projected/080fa697-4720-424e-b75e-6564061cd68f-kube-api-access-9bfrg\") pod \"designate-operator-controller-manager-588d4d986b-hf8n5\" (UID: \"080fa697-4720-424e-b75e-6564061cd68f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.911330 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.911401 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.911992 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hnl\" (UniqueName: \"kubernetes.io/projected/d2375678-e630-4376-9dfd-28efbc77aed4-kube-api-access-s5hnl\") pod \"heat-operator-controller-manager-67dd5f86f5-rsrjx\" (UID: \"d2375678-e630-4376-9dfd-28efbc77aed4\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.925064 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.925735 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fzszp" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.937365 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltv7\" (UniqueName: \"kubernetes.io/projected/0f77f094-1b90-43a6-85be-27e8b1fda71f-kube-api-access-hltv7\") pod \"glance-operator-controller-manager-79df6bcc97-8265b\" (UID: \"0f77f094-1b90-43a6-85be-27e8b1fda71f\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.937735 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.937815 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwff2\" (UniqueName: \"kubernetes.io/projected/f073a654-efe9-4fd0-9c08-23d9fdb0d492-kube-api-access-rwff2\") pod \"ironic-operator-controller-manager-6f787dddc9-725xd\" (UID: \"f073a654-efe9-4fd0-9c08-23d9fdb0d492\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.937913 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vqpd\" (UniqueName: \"kubernetes.io/projected/a960df53-d712-424a-85a7-64b0e50c911f-kube-api-access-6vqpd\") pod 
\"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.969812 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zrczt"] Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.971035 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.992054 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2crnw" Mar 19 19:14:45 crc kubenswrapper[4826]: I0319 19:14:45.992179 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:45.998031 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.006483 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.022878 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.039010 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwff2\" (UniqueName: \"kubernetes.io/projected/f073a654-efe9-4fd0-9c08-23d9fdb0d492-kube-api-access-rwff2\") pod \"ironic-operator-controller-manager-6f787dddc9-725xd\" (UID: \"f073a654-efe9-4fd0-9c08-23d9fdb0d492\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.039162 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26t8z\" (UniqueName: \"kubernetes.io/projected/6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90-kube-api-access-26t8z\") pod \"manila-operator-controller-manager-55f864c847-zrczt\" (UID: \"6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.039228 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vqpd\" (UniqueName: \"kubernetes.io/projected/a960df53-d712-424a-85a7-64b0e50c911f-kube-api-access-6vqpd\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.039272 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.039415 4826 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.039461 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert podName:a960df53-d712-424a-85a7-64b0e50c911f nodeName:}" failed. No retries permitted until 2026-03-19 19:14:46.539441816 +0000 UTC m=+1111.293510129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert") pod "infra-operator-controller-manager-7b9c774f96-zjkbj" (UID: "a960df53-d712-424a-85a7-64b0e50c911f") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.040104 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.059288 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.065716 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.066838 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.074860 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zrczt"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.081997 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-l9j8v" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.085862 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.089076 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.092970 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zcj6f" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.096585 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwff2\" (UniqueName: \"kubernetes.io/projected/f073a654-efe9-4fd0-9c08-23d9fdb0d492-kube-api-access-rwff2\") pod \"ironic-operator-controller-manager-6f787dddc9-725xd\" (UID: \"f073a654-efe9-4fd0-9c08-23d9fdb0d492\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.106401 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vqpd\" (UniqueName: \"kubernetes.io/projected/a960df53-d712-424a-85a7-64b0e50c911f-kube-api-access-6vqpd\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.110719 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.125735 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.138415 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-sfs65"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.139603 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.141610 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mrtq\" (UniqueName: \"kubernetes.io/projected/271f8c86-929d-46a4-8852-f5ec8e701bcb-kube-api-access-4mrtq\") pod \"mariadb-operator-controller-manager-67ccfc9778-xpq6x\" (UID: \"271f8c86-929d-46a4-8852-f5ec8e701bcb\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.147259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccgh\" (UniqueName: \"kubernetes.io/projected/49f5fbe6-ba93-4ff2-b575-aa08dceb2622-kube-api-access-bccgh\") pod \"keystone-operator-controller-manager-768b96df4c-4bkbn\" (UID: \"49f5fbe6-ba93-4ff2-b575-aa08dceb2622\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.147407 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-26t8z\" (UniqueName: \"kubernetes.io/projected/6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90-kube-api-access-26t8z\") pod \"manila-operator-controller-manager-55f864c847-zrczt\" (UID: \"6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.148784 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-sfs65"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.149232 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bc85d" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.156150 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.158981 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.160397 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.169265 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-92k7w" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.172900 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.215434 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zs74n"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.216864 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.218388 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26t8z\" (UniqueName: \"kubernetes.io/projected/6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90-kube-api-access-26t8z\") pod \"manila-operator-controller-manager-55f864c847-zrczt\" (UID: \"6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.219230 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ngbf5" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.223845 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.246311 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.247763 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.248919 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccgh\" (UniqueName: \"kubernetes.io/projected/49f5fbe6-ba93-4ff2-b575-aa08dceb2622-kube-api-access-bccgh\") pod \"keystone-operator-controller-manager-768b96df4c-4bkbn\" (UID: \"49f5fbe6-ba93-4ff2-b575-aa08dceb2622\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.248952 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7gl\" (UniqueName: \"kubernetes.io/projected/6243b523-966a-4f1d-b663-2f1ed4614fdb-kube-api-access-wh7gl\") pod \"ovn-operator-controller-manager-884679f54-zs74n\" (UID: \"6243b523-966a-4f1d-b663-2f1ed4614fdb\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.249018 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2q5b\" (UniqueName: \"kubernetes.io/projected/918ac815-fe60-44b9-b6c0-c99ee8dc80b8-kube-api-access-l2q5b\") pod \"neutron-operator-controller-manager-767865f676-sfs65\" (UID: \"918ac815-fe60-44b9-b6c0-c99ee8dc80b8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.249053 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrtq\" (UniqueName: \"kubernetes.io/projected/271f8c86-929d-46a4-8852-f5ec8e701bcb-kube-api-access-4mrtq\") pod \"mariadb-operator-controller-manager-67ccfc9778-xpq6x\" (UID: \"271f8c86-929d-46a4-8852-f5ec8e701bcb\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.249087 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.249118 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfdb7\" (UniqueName: \"kubernetes.io/projected/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-kube-api-access-hfdb7\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.251016 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cmbts" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.276272 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.277533 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.280443 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccgh\" (UniqueName: \"kubernetes.io/projected/49f5fbe6-ba93-4ff2-b575-aa08dceb2622-kube-api-access-bccgh\") pod \"keystone-operator-controller-manager-768b96df4c-4bkbn\" (UID: \"49f5fbe6-ba93-4ff2-b575-aa08dceb2622\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.294616 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrtq\" (UniqueName: \"kubernetes.io/projected/271f8c86-929d-46a4-8852-f5ec8e701bcb-kube-api-access-4mrtq\") pod \"mariadb-operator-controller-manager-67ccfc9778-xpq6x\" (UID: \"271f8c86-929d-46a4-8852-f5ec8e701bcb\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.303541 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.305416 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.310970 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-xrc9x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.331221 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.331597 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350375 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350428 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfdb7\" (UniqueName: \"kubernetes.io/projected/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-kube-api-access-hfdb7\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350485 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jm9\" (UniqueName: \"kubernetes.io/projected/44055ef9-1bc5-4b25-a40d-553a1546fc15-kube-api-access-d6jm9\") pod \"nova-operator-controller-manager-5d488d59fb-tjcmb\" (UID: \"44055ef9-1bc5-4b25-a40d-553a1546fc15\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350524 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh7gl\" (UniqueName: \"kubernetes.io/projected/6243b523-966a-4f1d-b663-2f1ed4614fdb-kube-api-access-wh7gl\") pod 
\"ovn-operator-controller-manager-884679f54-zs74n\" (UID: \"6243b523-966a-4f1d-b663-2f1ed4614fdb\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350550 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4f94\" (UniqueName: \"kubernetes.io/projected/7137162e-cccf-4ce6-9dc4-7380db33a85a-kube-api-access-b4f94\") pod \"octavia-operator-controller-manager-5b9f45d989-j4p25\" (UID: \"7137162e-cccf-4ce6-9dc4-7380db33a85a\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.350588 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2q5b\" (UniqueName: \"kubernetes.io/projected/918ac815-fe60-44b9-b6c0-c99ee8dc80b8-kube-api-access-l2q5b\") pod \"neutron-operator-controller-manager-767865f676-sfs65\" (UID: \"918ac815-fe60-44b9-b6c0-c99ee8dc80b8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.357823 4826 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.357929 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert podName:4f382869-5ee2-4a46-8188-d4ddd0bee2fa nodeName:}" failed. No retries permitted until 2026-03-19 19:14:46.857908793 +0000 UTC m=+1111.611977096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-b76w9" (UID: "4f382869-5ee2-4a46-8188-d4ddd0bee2fa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.370940 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zs74n"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.388797 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7gl\" (UniqueName: \"kubernetes.io/projected/6243b523-966a-4f1d-b663-2f1ed4614fdb-kube-api-access-wh7gl\") pod \"ovn-operator-controller-manager-884679f54-zs74n\" (UID: \"6243b523-966a-4f1d-b663-2f1ed4614fdb\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.397418 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfdb7\" (UniqueName: \"kubernetes.io/projected/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-kube-api-access-hfdb7\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.398140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2q5b\" (UniqueName: \"kubernetes.io/projected/918ac815-fe60-44b9-b6c0-c99ee8dc80b8-kube-api-access-l2q5b\") pod \"neutron-operator-controller-manager-767865f676-sfs65\" (UID: \"918ac815-fe60-44b9-b6c0-c99ee8dc80b8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.429310 4826 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.430562 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.432762 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wtjck" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.441332 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.453178 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4f94\" (UniqueName: \"kubernetes.io/projected/7137162e-cccf-4ce6-9dc4-7380db33a85a-kube-api-access-b4f94\") pod \"octavia-operator-controller-manager-5b9f45d989-j4p25\" (UID: \"7137162e-cccf-4ce6-9dc4-7380db33a85a\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.453352 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpfb7\" (UniqueName: \"kubernetes.io/projected/79a89fcd-3226-4314-951d-d94af2ac242c-kube-api-access-xpfb7\") pod \"placement-operator-controller-manager-5784578c99-kkmzl\" (UID: \"79a89fcd-3226-4314-951d-d94af2ac242c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.453387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jm9\" (UniqueName: \"kubernetes.io/projected/44055ef9-1bc5-4b25-a40d-553a1546fc15-kube-api-access-d6jm9\") pod \"nova-operator-controller-manager-5d488d59fb-tjcmb\" 
(UID: \"44055ef9-1bc5-4b25-a40d-553a1546fc15\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.464896 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.488288 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jm9\" (UniqueName: \"kubernetes.io/projected/44055ef9-1bc5-4b25-a40d-553a1546fc15-kube-api-access-d6jm9\") pod \"nova-operator-controller-manager-5d488d59fb-tjcmb\" (UID: \"44055ef9-1bc5-4b25-a40d-553a1546fc15\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.495265 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-skdcp"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.502227 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.502634 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.503232 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.503477 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4f94\" (UniqueName: \"kubernetes.io/projected/7137162e-cccf-4ce6-9dc4-7380db33a85a-kube-api-access-b4f94\") pod \"octavia-operator-controller-manager-5b9f45d989-j4p25\" (UID: \"7137162e-cccf-4ce6-9dc4-7380db33a85a\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.506145 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xx259" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.513295 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-skdcp"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.518536 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.520120 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.528387 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xx44z" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.529941 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.551617 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.555290 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.558588 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hffqn\" (UniqueName: \"kubernetes.io/projected/5d8869b3-7d43-4db2-b79d-f05c13d0d6f2-kube-api-access-hffqn\") pod \"telemetry-operator-controller-manager-6c5c766d94-258q2\" (UID: \"5d8869b3-7d43-4db2-b79d-f05c13d0d6f2\") " pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.558690 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpfb7\" (UniqueName: \"kubernetes.io/projected/79a89fcd-3226-4314-951d-d94af2ac242c-kube-api-access-xpfb7\") pod \"placement-operator-controller-manager-5784578c99-kkmzl\" (UID: \"79a89fcd-3226-4314-951d-d94af2ac242c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.558724 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.558791 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnkc\" (UniqueName: \"kubernetes.io/projected/aff2d31f-3465-4c0c-8bbf-b04dfdb92db0-kube-api-access-6mnkc\") pod \"swift-operator-controller-manager-c674c5965-skdcp\" (UID: \"aff2d31f-3465-4c0c-8bbf-b04dfdb92db0\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.560425 4826 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.560496 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert podName:a960df53-d712-424a-85a7-64b0e50c911f nodeName:}" failed. No retries permitted until 2026-03-19 19:14:47.560470329 +0000 UTC m=+1112.314538642 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert") pod "infra-operator-controller-manager-7b9c774f96-zjkbj" (UID: "a960df53-d712-424a-85a7-64b0e50c911f") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.564990 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-dz8c6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.571105 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.580149 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpfb7\" (UniqueName: \"kubernetes.io/projected/79a89fcd-3226-4314-951d-d94af2ac242c-kube-api-access-xpfb7\") pod \"placement-operator-controller-manager-5784578c99-kkmzl\" (UID: \"79a89fcd-3226-4314-951d-d94af2ac242c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.595186 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.648746 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.649941 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.653947 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4959v" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.660556 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrs8z\" (UniqueName: \"kubernetes.io/projected/e36e6f7a-53ec-4262-b9e5-798353e5bf15-kube-api-access-lrs8z\") pod \"test-operator-controller-manager-5c5cb9c4d7-6vmk6\" (UID: \"e36e6f7a-53ec-4262-b9e5-798353e5bf15\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.660815 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hffqn\" (UniqueName: \"kubernetes.io/projected/5d8869b3-7d43-4db2-b79d-f05c13d0d6f2-kube-api-access-hffqn\") pod \"telemetry-operator-controller-manager-6c5c766d94-258q2\" (UID: \"5d8869b3-7d43-4db2-b79d-f05c13d0d6f2\") " pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.660940 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnkc\" (UniqueName: \"kubernetes.io/projected/aff2d31f-3465-4c0c-8bbf-b04dfdb92db0-kube-api-access-6mnkc\") pod \"swift-operator-controller-manager-c674c5965-skdcp\" (UID: \"aff2d31f-3465-4c0c-8bbf-b04dfdb92db0\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.675038 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.685922 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hffqn\" (UniqueName: \"kubernetes.io/projected/5d8869b3-7d43-4db2-b79d-f05c13d0d6f2-kube-api-access-hffqn\") pod \"telemetry-operator-controller-manager-6c5c766d94-258q2\" (UID: \"5d8869b3-7d43-4db2-b79d-f05c13d0d6f2\") " pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.686762 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnkc\" (UniqueName: \"kubernetes.io/projected/aff2d31f-3465-4c0c-8bbf-b04dfdb92db0-kube-api-access-6mnkc\") pod \"swift-operator-controller-manager-c674c5965-skdcp\" (UID: \"aff2d31f-3465-4c0c-8bbf-b04dfdb92db0\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.706610 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.762876 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwsll\" (UniqueName: \"kubernetes.io/projected/dc64459f-49c1-41f5-b946-88ab7bc8e1d8-kube-api-access-nwsll\") pod \"watcher-operator-controller-manager-6c4d75f7f9-7l4t6\" (UID: \"dc64459f-49c1-41f5-b946-88ab7bc8e1d8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.762948 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrs8z\" (UniqueName: \"kubernetes.io/projected/e36e6f7a-53ec-4262-b9e5-798353e5bf15-kube-api-access-lrs8z\") pod 
\"test-operator-controller-manager-5c5cb9c4d7-6vmk6\" (UID: \"e36e6f7a-53ec-4262-b9e5-798353e5bf15\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.764174 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.765512 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.768415 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kqr8k" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.769009 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.769111 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.782009 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.785000 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrs8z\" (UniqueName: \"kubernetes.io/projected/e36e6f7a-53ec-4262-b9e5-798353e5bf15-kube-api-access-lrs8z\") pod \"test-operator-controller-manager-5c5cb9c4d7-6vmk6\" (UID: \"e36e6f7a-53ec-4262-b9e5-798353e5bf15\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.786939 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.825374 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.827854 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.834418 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g4m5l" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.838380 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865367 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865431 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865500 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lnhhq\" (UniqueName: \"kubernetes.io/projected/b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6-kube-api-access-lnhhq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jvh4\" (UID: \"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865539 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsll\" (UniqueName: \"kubernetes.io/projected/dc64459f-49c1-41f5-b946-88ab7bc8e1d8-kube-api-access-nwsll\") pod \"watcher-operator-controller-manager-6c4d75f7f9-7l4t6\" (UID: \"dc64459f-49c1-41f5-b946-88ab7bc8e1d8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865574 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.865600 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-744x2\" (UniqueName: \"kubernetes.io/projected/50980b03-91b0-4e4d-9923-e2a531458fd4-kube-api-access-744x2\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.865783 4826 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:46 crc 
kubenswrapper[4826]: E0319 19:14:46.865826 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert podName:4f382869-5ee2-4a46-8188-d4ddd0bee2fa nodeName:}" failed. No retries permitted until 2026-03-19 19:14:47.865811929 +0000 UTC m=+1112.619880242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-b76w9" (UID: "4f382869-5ee2-4a46-8188-d4ddd0bee2fa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.884939 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs"] Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.887815 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwsll\" (UniqueName: \"kubernetes.io/projected/dc64459f-49c1-41f5-b946-88ab7bc8e1d8-kube-api-access-nwsll\") pod \"watcher-operator-controller-manager-6c4d75f7f9-7l4t6\" (UID: \"dc64459f-49c1-41f5-b946-88ab7bc8e1d8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.954678 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.967516 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.967673 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhhq\" (UniqueName: \"kubernetes.io/projected/b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6-kube-api-access-lnhhq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jvh4\" (UID: \"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.967707 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.967737 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.967771 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. 
No retries permitted until 2026-03-19 19:14:47.467751793 +0000 UTC m=+1112.221820106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.967795 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-744x2\" (UniqueName: \"kubernetes.io/projected/50980b03-91b0-4e4d-9923-e2a531458fd4-kube-api-access-744x2\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.967867 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: E0319 19:14:46.967913 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:47.467896226 +0000 UTC m=+1112.221964539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.976123 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.986232 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhhq\" (UniqueName: \"kubernetes.io/projected/b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6-kube-api-access-lnhhq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5jvh4\" (UID: \"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.988098 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-744x2\" (UniqueName: \"kubernetes.io/projected/50980b03-91b0-4e4d-9923-e2a531458fd4-kube-api-access-744x2\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:46 crc kubenswrapper[4826]: I0319 19:14:46.992573 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.017808 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.184354 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx"] Mar 19 19:14:47 crc kubenswrapper[4826]: W0319 19:14:47.192401 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2375678_e630_4376_9dfd_28efbc77aed4.slice/crio-59b64879b2001be6a10e12d5010843aeba2e3e6d2e2bf140c1bfde8be392df7c WatchSource:0}: Error finding container 59b64879b2001be6a10e12d5010843aeba2e3e6d2e2bf140c1bfde8be392df7c: Status 404 returned error can't find the container with id 59b64879b2001be6a10e12d5010843aeba2e3e6d2e2bf140c1bfde8be392df7c Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.214966 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps"] Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.235529 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5"] Mar 19 19:14:47 crc kubenswrapper[4826]: W0319 19:14:47.244826 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38267b94_39ea_4067_9b6e_3d863ff60494.slice/crio-91a2769edfff19a8e539cac502f4142c08bd3c9356c5eb048a0c1f978b8cc895 WatchSource:0}: Error finding container 91a2769edfff19a8e539cac502f4142c08bd3c9356c5eb048a0c1f978b8cc895: Status 404 returned error can't find the container with id 91a2769edfff19a8e539cac502f4142c08bd3c9356c5eb048a0c1f978b8cc895 Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.247613 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.306053 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn"] Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.324987 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b"] Mar 19 19:14:47 crc kubenswrapper[4826]: W0319 19:14:47.329319 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f5fbe6_ba93_4ff2_b575_aa08dceb2622.slice/crio-7a962ebaf0ad0f652fc34f714e120d370e21fc160586fa9600b1bdfab1ed2ff7 WatchSource:0}: Error finding container 7a962ebaf0ad0f652fc34f714e120d370e21fc160586fa9600b1bdfab1ed2ff7: Status 404 returned error can't find the container with id 7a962ebaf0ad0f652fc34f714e120d370e21fc160586fa9600b1bdfab1ed2ff7 Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.337707 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd"] Mar 19 19:14:47 crc kubenswrapper[4826]: W0319 19:14:47.343919 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f77f094_1b90_43a6_85be_27e8b1fda71f.slice/crio-313a7481d85cbf6a33ba361811d09656da6699a6fc8b650fe0e613304015ee5a WatchSource:0}: Error finding container 313a7481d85cbf6a33ba361811d09656da6699a6fc8b650fe0e613304015ee5a: Status 404 returned error can't find the container with id 313a7481d85cbf6a33ba361811d09656da6699a6fc8b650fe0e613304015ee5a Mar 19 19:14:47 crc kubenswrapper[4826]: W0319 19:14:47.345291 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf073a654_efe9_4fd0_9c08_23d9fdb0d492.slice/crio-54c5b632d22479cb1e763c021a77c35af73a55b1fb24b145acb44126cb8856f1 WatchSource:0}: Error finding container 54c5b632d22479cb1e763c021a77c35af73a55b1fb24b145acb44126cb8856f1: Status 404 returned error can't find the container with id 54c5b632d22479cb1e763c021a77c35af73a55b1fb24b145acb44126cb8856f1 Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.430416 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" event={"ID":"5f60643c-c919-436b-bd23-9e39698d9c9b","Type":"ContainerStarted","Data":"36868ae495bb1472de73a11e82aef549b32aa17c8c476f453c710a8a8540d382"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.431625 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" event={"ID":"d2375678-e630-4376-9dfd-28efbc77aed4","Type":"ContainerStarted","Data":"59b64879b2001be6a10e12d5010843aeba2e3e6d2e2bf140c1bfde8be392df7c"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.432933 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" event={"ID":"f073a654-efe9-4fd0-9c08-23d9fdb0d492","Type":"ContainerStarted","Data":"54c5b632d22479cb1e763c021a77c35af73a55b1fb24b145acb44126cb8856f1"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.434115 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" event={"ID":"49f5fbe6-ba93-4ff2-b575-aa08dceb2622","Type":"ContainerStarted","Data":"7a962ebaf0ad0f652fc34f714e120d370e21fc160586fa9600b1bdfab1ed2ff7"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.435632 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" 
event={"ID":"38267b94-39ea-4067-9b6e-3d863ff60494","Type":"ContainerStarted","Data":"91a2769edfff19a8e539cac502f4142c08bd3c9356c5eb048a0c1f978b8cc895"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.436951 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" event={"ID":"080fa697-4720-424e-b75e-6564061cd68f","Type":"ContainerStarted","Data":"f5e3c43e8aaf16cca5bbc23f37b15db16e5b876bff410ea3638f570923d40ed5"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.438200 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" event={"ID":"0f77f094-1b90-43a6-85be-27e8b1fda71f","Type":"ContainerStarted","Data":"313a7481d85cbf6a33ba361811d09656da6699a6fc8b650fe0e613304015ee5a"} Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.481599 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.481950 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.481836 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.482427 4826 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:48.482408842 +0000 UTC m=+1113.236477155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.482149 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.482733 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:48.482717439 +0000 UTC m=+1113.236785752 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:14:47 crc kubenswrapper[4826]: I0319 19:14:47.583558 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.583848 4826 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:47 crc kubenswrapper[4826]: E0319 19:14:47.583991 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert podName:a960df53-d712-424a-85a7-64b0e50c911f nodeName:}" failed. No retries permitted until 2026-03-19 19:14:49.583973756 +0000 UTC m=+1114.338042069 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert") pod "infra-operator-controller-manager-7b9c774f96-zjkbj" (UID: "a960df53-d712-424a-85a7-64b0e50c911f") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:47.889720 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:47.889932 4826 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:47.890107 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert podName:4f382869-5ee2-4a46-8188-d4ddd0bee2fa nodeName:}" failed. No retries permitted until 2026-03-19 19:14:49.890090845 +0000 UTC m=+1114.644159158 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-b76w9" (UID: "4f382869-5ee2-4a46-8188-d4ddd0bee2fa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.115733 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zs74n"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.201721 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x"] Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.219597 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod271f8c86_929d_46a4_8852_f5ec8e701bcb.slice/crio-7bf244fcb5ecedf05220116ce71008044a584c25aaf5c38b1060081ee4a5e37c\": RecentStats: unable to find data in memory cache]" Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.253311 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.281723 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.285148 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.306853 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-sfs65"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.319159 4826 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-zrczt"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.363818 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.382893 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.410114 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-skdcp"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.427756 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.435202 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6"] Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.441574 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4"] Mar 19 19:14:48 crc kubenswrapper[4826]: W0319 19:14:48.442034 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8869b3_7d43_4db2_b79d_f05c13d0d6f2.slice/crio-6fcf9c593bf3a489d4d346fad60ef53a299cb981a4e0192208613bae86e39802 WatchSource:0}: Error finding container 6fcf9c593bf3a489d4d346fad60ef53a299cb981a4e0192208613bae86e39802: Status 404 returned error can't find the container with id 6fcf9c593bf3a489d4d346fad60ef53a299cb981a4e0192208613bae86e39802 Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.457144 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6mnkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-skdcp_openstack-operators(aff2d31f-3465-4c0c-8bbf-b04dfdb92db0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.459096 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nwsll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-7l4t6_openstack-operators(dc64459f-49c1-41f5-b946-88ab7bc8e1d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.459154 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" Mar 19 19:14:48 crc 
kubenswrapper[4826]: I0319 19:14:48.460463 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" event={"ID":"e36e6f7a-53ec-4262-b9e5-798353e5bf15","Type":"ContainerStarted","Data":"535d7d8e75c049a6753990f425d2e2f4e177700070a91d5424801b0b7ca0f2ff"} Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.460526 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.462869 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" event={"ID":"44055ef9-1bc5-4b25-a40d-553a1546fc15","Type":"ContainerStarted","Data":"78fa23e1d1c200f7442286a3f56788b32482c418f20d06c30a8ca65f429bf68e"} Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.463131 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m 
DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnhhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5jvh4_openstack-operators(b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.464349 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" podUID="b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6" Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.465500 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" event={"ID":"ee5c97c9-5dc0-4292-9a34-08ca45f5387a","Type":"ContainerStarted","Data":"14df9860721722944791af802e7e28b7a4c19902e7d81a682b23a628a4c2a453"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.467438 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" event={"ID":"6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90","Type":"ContainerStarted","Data":"e189b6462803ffc7e8a48d871805c76828bec81c6ed00f23b566aafb35235d89"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.476409 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" event={"ID":"918ac815-fe60-44b9-b6c0-c99ee8dc80b8","Type":"ContainerStarted","Data":"ffef93288c0832bc89812a9c6da5496a31235c1576f341dd3dabb298a93bc005"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.479215 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" event={"ID":"7137162e-cccf-4ce6-9dc4-7380db33a85a","Type":"ContainerStarted","Data":"ee46af0ec4f0c68ae8fbbf9303052f39fe9798e98d6a60fec4da1193a990a99b"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.495563 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" event={"ID":"6243b523-966a-4f1d-b663-2f1ed4614fdb","Type":"ContainerStarted","Data":"7dcd4ecff2ecbab9e7f9051f04683d4ac03588597c8844f01b0ef5bba3f51133"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.502891 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.503068 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod 
\"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.503322 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.503375 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:50.503357787 +0000 UTC m=+1115.257426100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.503937 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: E0319 19:14:48.503973 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:50.503960652 +0000 UTC m=+1115.258028965 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.504114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" event={"ID":"79a89fcd-3226-4314-951d-d94af2ac242c","Type":"ContainerStarted","Data":"4432b3ab1df67ae83483493d0bce1cc4ea997f8c7a2db9c59d51a5c2abff0169"} Mar 19 19:14:48 crc kubenswrapper[4826]: I0319 19:14:48.506134 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" event={"ID":"271f8c86-929d-46a4-8852-f5ec8e701bcb","Type":"ContainerStarted","Data":"7bf244fcb5ecedf05220116ce71008044a584c25aaf5c38b1060081ee4a5e37c"} Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.531018 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" event={"ID":"5d8869b3-7d43-4db2-b79d-f05c13d0d6f2","Type":"ContainerStarted","Data":"6fcf9c593bf3a489d4d346fad60ef53a299cb981a4e0192208613bae86e39802"} Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.533819 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" event={"ID":"dc64459f-49c1-41f5-b946-88ab7bc8e1d8","Type":"ContainerStarted","Data":"cae386598ad0e7e468726210285b40537b0c8dec63c35ad706371f2b8035ba18"} Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.536274 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.543580 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" event={"ID":"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6","Type":"ContainerStarted","Data":"c068f945ef9ad6f15d8061cd0eb3074a674afde6bb53f96f27f8d6ca98da0258"} Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.554244 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" podUID="b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6" Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.562941 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" event={"ID":"aff2d31f-3465-4c0c-8bbf-b04dfdb92db0","Type":"ContainerStarted","Data":"5f258ed833fad65e18600cf7f11449710b1b36e1e3154b690824af50af35589f"} Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.567415 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.632674 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.632864 4826 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.632934 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert podName:a960df53-d712-424a-85a7-64b0e50c911f nodeName:}" failed. No retries permitted until 2026-03-19 19:14:53.632915467 +0000 UTC m=+1118.386983780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert") pod "infra-operator-controller-manager-7b9c774f96-zjkbj" (UID: "a960df53-d712-424a-85a7-64b0e50c911f") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:49 crc kubenswrapper[4826]: I0319 19:14:49.942916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.943162 4826 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:49 crc kubenswrapper[4826]: E0319 19:14:49.943245 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert podName:4f382869-5ee2-4a46-8188-d4ddd0bee2fa nodeName:}" failed. No retries permitted until 2026-03-19 19:14:53.943224538 +0000 UTC m=+1118.697292851 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-b76w9" (UID: "4f382869-5ee2-4a46-8188-d4ddd0bee2fa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:50 crc kubenswrapper[4826]: I0319 19:14:50.554366 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:50 crc kubenswrapper[4826]: I0319 19:14:50.554450 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.554589 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.554672 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.554682 4826 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:54.554648615 +0000 UTC m=+1119.308716928 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.554761 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:14:54.554743878 +0000 UTC m=+1119.308812181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.576520 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" podUID="b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6" Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.577007 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" Mar 19 19:14:50 crc kubenswrapper[4826]: E0319 19:14:50.577055 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" Mar 19 19:14:53 crc kubenswrapper[4826]: I0319 19:14:53.721862 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:14:53 crc kubenswrapper[4826]: E0319 19:14:53.722125 4826 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:53 crc kubenswrapper[4826]: E0319 19:14:53.722557 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert podName:a960df53-d712-424a-85a7-64b0e50c911f nodeName:}" failed. No retries permitted until 2026-03-19 19:15:01.72251838 +0000 UTC m=+1126.476586733 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert") pod "infra-operator-controller-manager-7b9c774f96-zjkbj" (UID: "a960df53-d712-424a-85a7-64b0e50c911f") : secret "infra-operator-webhook-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: I0319 19:14:54.027228 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.027435 4826 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.027729 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert podName:4f382869-5ee2-4a46-8188-d4ddd0bee2fa nodeName:}" failed. No retries permitted until 2026-03-19 19:15:02.027703486 +0000 UTC m=+1126.781771839 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-b76w9" (UID: "4f382869-5ee2-4a46-8188-d4ddd0bee2fa") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: I0319 19:14:54.556500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:54 crc kubenswrapper[4826]: I0319 19:14:54.556595 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.556692 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.556759 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:15:02.556741223 +0000 UTC m=+1127.310809536 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.556820 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:14:54 crc kubenswrapper[4826]: E0319 19:14:54.556872 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:15:02.556856876 +0000 UTC m=+1127.310925189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:14:55 crc kubenswrapper[4826]: I0319 19:14:55.400215 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:14:55 crc kubenswrapper[4826]: I0319 19:14:55.400310 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.143516 4826 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6"] Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.146145 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.155420 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.155577 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.158380 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6"] Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.233818 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.234023 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} 
{} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9bfrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-hf8n5_openstack-operators(080fa697-4720-424e-b75e-6564061cd68f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.235282 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.264124 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.264210 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.264247 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fdb\" (UniqueName: \"kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.365369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5fdb\" (UniqueName: \"kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc 
kubenswrapper[4826]: I0319 19:15:00.365526 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.365575 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.366642 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.376282 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.389742 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fdb\" (UniqueName: \"kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb\") pod \"collect-profiles-29565795-49jd6\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: I0319 19:15:00.481385 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.690781 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.836184 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.836500 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hltv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-8265b_openstack-operators(0f77f094-1b90-43a6-85be-27e8b1fda71f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:00 crc kubenswrapper[4826]: E0319 19:15:00.837782 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" Mar 19 19:15:01 crc kubenswrapper[4826]: E0319 19:15:01.698725 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" Mar 19 19:15:01 crc kubenswrapper[4826]: I0319 19:15:01.789348 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:15:01 crc kubenswrapper[4826]: I0319 19:15:01.794303 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a960df53-d712-424a-85a7-64b0e50c911f-cert\") pod \"infra-operator-controller-manager-7b9c774f96-zjkbj\" (UID: \"a960df53-d712-424a-85a7-64b0e50c911f\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:15:01 crc kubenswrapper[4826]: I0319 19:15:01.953647 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:15:02 crc kubenswrapper[4826]: I0319 19:15:02.094996 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:15:02 crc kubenswrapper[4826]: I0319 19:15:02.099714 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f382869-5ee2-4a46-8188-d4ddd0bee2fa-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-b76w9\" (UID: \"4f382869-5ee2-4a46-8188-d4ddd0bee2fa\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.244497 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.244738 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xvx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-zm4ps_openstack-operators(38267b94-39ea-4067-9b6e-3d863ff60494): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.249302 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" Mar 19 19:15:02 crc kubenswrapper[4826]: I0319 19:15:02.291237 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:15:02 crc kubenswrapper[4826]: I0319 19:15:02.619737 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:02 crc kubenswrapper[4826]: I0319 19:15:02.619901 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.620045 4826 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.620110 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:15:18.620090419 +0000 UTC m=+1143.374158732 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "webhook-server-cert" not found Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.620172 4826 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.620206 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs podName:50980b03-91b0-4e4d-9923-e2a531458fd4 nodeName:}" failed. No retries permitted until 2026-03-19 19:15:18.620196941 +0000 UTC m=+1143.374265254 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs") pod "openstack-operator-controller-manager-646cd56bc9-8t2bm" (UID: "50980b03-91b0-4e4d-9923-e2a531458fd4") : secret "metrics-server-cert" not found Mar 19 19:15:02 crc kubenswrapper[4826]: E0319 19:15:02.712177 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" Mar 19 19:15:03 crc kubenswrapper[4826]: E0319 19:15:03.217245 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 19 19:15:03 crc kubenswrapper[4826]: E0319 
19:15:03.217707 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xpfb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-kkmzl_openstack-operators(79a89fcd-3226-4314-951d-d94af2ac242c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:03 crc kubenswrapper[4826]: E0319 19:15:03.219019 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" podUID="79a89fcd-3226-4314-951d-d94af2ac242c" Mar 19 19:15:04 crc kubenswrapper[4826]: E0319 19:15:03.719063 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" podUID="79a89fcd-3226-4314-951d-d94af2ac242c" Mar 19 19:15:04 crc kubenswrapper[4826]: E0319 19:15:04.311804 4826 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 19 19:15:04 crc kubenswrapper[4826]: E0319 19:15:04.312010 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrs8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-6vmk6_openstack-operators(e36e6f7a-53ec-4262-b9e5-798353e5bf15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:04 crc kubenswrapper[4826]: E0319 19:15:04.313187 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" Mar 19 19:15:04 crc kubenswrapper[4826]: E0319 19:15:04.729747 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" Mar 19 19:15:05 crc kubenswrapper[4826]: E0319 19:15:05.947683 4826 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 19 19:15:05 crc kubenswrapper[4826]: E0319 19:15:05.948290 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2q5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-sfs65_openstack-operators(918ac815-fe60-44b9-b6c0-c99ee8dc80b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:05 crc kubenswrapper[4826]: E0319 19:15:05.949506 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" Mar 19 19:15:06 crc kubenswrapper[4826]: E0319 19:15:06.741315 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" Mar 19 19:15:08 crc kubenswrapper[4826]: E0319 19:15:08.495949 4826 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 19 19:15:08 crc kubenswrapper[4826]: E0319 19:15:08.496412 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wh7gl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-zs74n_openstack-operators(6243b523-966a-4f1d-b663-2f1ed4614fdb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:08 crc kubenswrapper[4826]: E0319 19:15:08.497691 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" Mar 19 19:15:08 crc kubenswrapper[4826]: E0319 19:15:08.757639 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" Mar 19 19:15:09 crc kubenswrapper[4826]: E0319 19:15:09.065392 4826 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 19 19:15:09 crc kubenswrapper[4826]: E0319 19:15:09.065614 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26t8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-zrczt_openstack-operators(6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:09 crc kubenswrapper[4826]: E0319 19:15:09.066828 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" Mar 19 19:15:09 crc kubenswrapper[4826]: E0319 19:15:09.768859 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" Mar 19 19:15:10 crc kubenswrapper[4826]: E0319 19:15:10.791103 4826 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 19 19:15:10 crc kubenswrapper[4826]: E0319 19:15:10.791824 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d6jm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-tjcmb_openstack-operators(44055ef9-1bc5-4b25-a40d-553a1546fc15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:10 crc kubenswrapper[4826]: E0319 19:15:10.793151 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" Mar 19 19:15:11 crc kubenswrapper[4826]: E0319 19:15:11.462152 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 19 19:15:11 crc kubenswrapper[4826]: E0319 19:15:11.462309 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5hnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-rsrjx_openstack-operators(d2375678-e630-4376-9dfd-28efbc77aed4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:11 crc kubenswrapper[4826]: E0319 19:15:11.463574 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" Mar 19 19:15:11 crc kubenswrapper[4826]: E0319 19:15:11.785300 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" Mar 19 19:15:11 crc kubenswrapper[4826]: E0319 19:15:11.785366 4826 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.060784 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.060998 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4mrtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-xpq6x_openstack-operators(271f8c86-929d-46a4-8852-f5ec8e701bcb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.062725 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.128117 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.128376 4826 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.128499 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hffqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6c5c766d94-258q2_openstack-operators(5d8869b3-7d43-4db2-b79d-f05c13d0d6f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.130472 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.758470 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.758701 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hkvsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-ngb9j_openstack-operators(ee5c97c9-5dc0-4292-9a34-08ca45f5387a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.760577 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.796746 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.797288 4826 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" Mar 19 19:15:12 crc kubenswrapper[4826]: E0319 19:15:12.797398 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:15c2ffcfe08e13a1dec28232b4ee653042564ac3\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" Mar 19 19:15:17 crc kubenswrapper[4826]: I0319 19:15:17.580203 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9"] Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.262048 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6"] Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.330584 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj"] Mar 19 19:15:18 crc kubenswrapper[4826]: W0319 19:15:18.338821 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda960df53_d712_424a_85a7_64b0e50c911f.slice/crio-08b4fcc95faf856efb5fd54f7d59e7a1d92b14dfa90172fe06beefb715371ca3 WatchSource:0}: Error finding container 08b4fcc95faf856efb5fd54f7d59e7a1d92b14dfa90172fe06beefb715371ca3: Status 404 returned error can't find the container with id 08b4fcc95faf856efb5fd54f7d59e7a1d92b14dfa90172fe06beefb715371ca3 Mar 19 19:15:18 crc 
kubenswrapper[4826]: I0319 19:15:18.637256 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.638150 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.643995 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-metrics-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.646262 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50980b03-91b0-4e4d-9923-e2a531458fd4-webhook-certs\") pod \"openstack-operator-controller-manager-646cd56bc9-8t2bm\" (UID: \"50980b03-91b0-4e4d-9923-e2a531458fd4\") " pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.730196 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kqr8k" Mar 19 19:15:18 crc kubenswrapper[4826]: 
I0319 19:15:18.737864 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.857791 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" event={"ID":"a960df53-d712-424a-85a7-64b0e50c911f","Type":"ContainerStarted","Data":"08b4fcc95faf856efb5fd54f7d59e7a1d92b14dfa90172fe06beefb715371ca3"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.861303 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" event={"ID":"49f5fbe6-ba93-4ff2-b575-aa08dceb2622","Type":"ContainerStarted","Data":"fbae391a355fca143beec49ad5c733d24d2d87928b4d4ffc5d5d192fd32d6246"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.862460 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.879908 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" event={"ID":"dc64459f-49c1-41f5-b946-88ab7bc8e1d8","Type":"ContainerStarted","Data":"86341981f990dd313fafd3ad31288090af93dcebf13e2ceb51c5487c724c77d5"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.880893 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.886855 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" event={"ID":"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6","Type":"ContainerStarted","Data":"d7fea80738dae6f7fc1cd87625eb88d0f943a85c0a65baac45a7c3ceda09127e"} Mar 
19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.894078 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" event={"ID":"e36e6f7a-53ec-4262-b9e5-798353e5bf15","Type":"ContainerStarted","Data":"61d79f08aef814872ceb84e75c780ec6ec93ec84ca6dc2fa776e7a8ab8d65433"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.894942 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.900823 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podStartSLOduration=8.459783771 podStartE2EDuration="33.900804122s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.335829219 +0000 UTC m=+1112.089897532" lastFinishedPulling="2026-03-19 19:15:12.77684957 +0000 UTC m=+1137.530917883" observedRunningTime="2026-03-19 19:15:18.89658087 +0000 UTC m=+1143.650649203" watchObservedRunningTime="2026-03-19 19:15:18.900804122 +0000 UTC m=+1143.654872435" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.909927 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" event={"ID":"080fa697-4720-424e-b75e-6564061cd68f","Type":"ContainerStarted","Data":"91ee439cd3099250e4f963c3380bde90f7c6552487df9e0cc33fb009ec6db276"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.910597 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.915889 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" 
event={"ID":"79a89fcd-3226-4314-951d-d94af2ac242c","Type":"ContainerStarted","Data":"aa78967d3e7a9430579fcede00919a69d6078be67f4a5f03bb285f24e92fcd4a"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.916152 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.932454 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podStartSLOduration=3.358583898 podStartE2EDuration="32.932436877s" podCreationTimestamp="2026-03-19 19:14:46 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.440362615 +0000 UTC m=+1113.194430928" lastFinishedPulling="2026-03-19 19:15:18.014215594 +0000 UTC m=+1142.768283907" observedRunningTime="2026-03-19 19:15:18.925571761 +0000 UTC m=+1143.679640084" watchObservedRunningTime="2026-03-19 19:15:18.932436877 +0000 UTC m=+1143.686505190" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.936801 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" event={"ID":"f073a654-efe9-4fd0-9c08-23d9fdb0d492","Type":"ContainerStarted","Data":"551ea97bff248de44602944c01ec7662e22f9f72421824709fd6bb957972ba0a"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.937473 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.938694 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" event={"ID":"38267b94-39ea-4067-9b6e-3d863ff60494","Type":"ContainerStarted","Data":"d0aa998d1f9489b9c4d36b5764a678bbc5c98ff707f120f886b97f7655249ec1"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.939095 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.940011 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" event={"ID":"0f77f094-1b90-43a6-85be-27e8b1fda71f","Type":"ContainerStarted","Data":"308e7798925ff5c58289d7bde2eacf17176005865caf2c61701339d14b9e2601"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.940376 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.941330 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" event={"ID":"aff2d31f-3465-4c0c-8bbf-b04dfdb92db0","Type":"ContainerStarted","Data":"ef3a4bf9976065fb17545bb63f573bcc90d42737d10d49ee3bf409d934d68fed"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.941683 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.958854 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podStartSLOduration=3.52217484 podStartE2EDuration="32.958839364s" podCreationTimestamp="2026-03-19 19:14:46 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.459022075 +0000 UTC m=+1113.213090388" lastFinishedPulling="2026-03-19 19:15:17.895686599 +0000 UTC m=+1142.649754912" observedRunningTime="2026-03-19 19:15:18.958052065 +0000 UTC m=+1143.712120378" watchObservedRunningTime="2026-03-19 19:15:18.958839364 +0000 UTC m=+1143.712907677" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.973944 4826 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" event={"ID":"5f60643c-c919-436b-bd23-9e39698d9c9b","Type":"ContainerStarted","Data":"d2c73a1088155c8cd7286d35d4140d9ee02ab13a54703394bd455123958580b9"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.974842 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.983355 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" event={"ID":"9aa2d483-2b01-4910-91fc-28ad369bbb17","Type":"ContainerStarted","Data":"fc3e9fe6b07ef836e5f967c917f083ea8f6c8fd4f479a1732ee5ee4089692923"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.983557 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" event={"ID":"9aa2d483-2b01-4910-91fc-28ad369bbb17","Type":"ContainerStarted","Data":"a46bf98ee63066773400f171cb1428fd6fec1b907693468570c8c37d38a29426"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.991128 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" event={"ID":"7137162e-cccf-4ce6-9dc4-7380db33a85a","Type":"ContainerStarted","Data":"0b5c426ab0a77b4f05b5408a3685d9a3d53c05a20614569d4f59f0e560fa47e3"} Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.993414 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:15:18 crc kubenswrapper[4826]: I0319 19:15:18.995535 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" podStartSLOduration=3.484164533 
podStartE2EDuration="32.995517531s" podCreationTimestamp="2026-03-19 19:14:46 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.462990462 +0000 UTC m=+1113.217058775" lastFinishedPulling="2026-03-19 19:15:17.97434346 +0000 UTC m=+1142.728411773" observedRunningTime="2026-03-19 19:15:18.988789478 +0000 UTC m=+1143.742857801" watchObservedRunningTime="2026-03-19 19:15:18.995517531 +0000 UTC m=+1143.749585844" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.018600 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podStartSLOduration=8.615219946 podStartE2EDuration="34.018587988s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.347521751 +0000 UTC m=+1112.101590064" lastFinishedPulling="2026-03-19 19:15:12.750889783 +0000 UTC m=+1137.504958106" observedRunningTime="2026-03-19 19:15:19.016864947 +0000 UTC m=+1143.770933270" watchObservedRunningTime="2026-03-19 19:15:19.018587988 +0000 UTC m=+1143.772656301" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.022310 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" event={"ID":"4f382869-5ee2-4a46-8188-d4ddd0bee2fa","Type":"ContainerStarted","Data":"aa8bf77cb0fedc4db98922638a55a00825190bacf604a6ab8ecc4e8516ec6c7b"} Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.080315 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podStartSLOduration=3.370707231 podStartE2EDuration="34.08029452s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.257319091 +0000 UTC m=+1112.011387404" lastFinishedPulling="2026-03-19 19:15:17.96690638 +0000 UTC m=+1142.720974693" observedRunningTime="2026-03-19 19:15:19.05130399 +0000 UTC 
m=+1143.805372323" watchObservedRunningTime="2026-03-19 19:15:19.08029452 +0000 UTC m=+1143.834362833" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.099760 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" podStartSLOduration=19.09974182 podStartE2EDuration="19.09974182s" podCreationTimestamp="2026-03-19 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:19.093574281 +0000 UTC m=+1143.847642604" watchObservedRunningTime="2026-03-19 19:15:19.09974182 +0000 UTC m=+1143.853810133" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.223301 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podStartSLOduration=3.569124897 podStartE2EDuration="34.223263895s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.351103828 +0000 UTC m=+1112.105172131" lastFinishedPulling="2026-03-19 19:15:18.005242826 +0000 UTC m=+1142.759311129" observedRunningTime="2026-03-19 19:15:19.140520115 +0000 UTC m=+1143.894588428" watchObservedRunningTime="2026-03-19 19:15:19.223263895 +0000 UTC m=+1143.977332208" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.317433 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podStartSLOduration=4.807554189 podStartE2EDuration="34.317414261s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.457020768 +0000 UTC m=+1113.211089071" lastFinishedPulling="2026-03-19 19:15:17.96688083 +0000 UTC m=+1142.720949143" observedRunningTime="2026-03-19 19:15:19.200411403 +0000 UTC m=+1143.954479726" watchObservedRunningTime="2026-03-19 19:15:19.317414261 +0000 UTC 
m=+1144.071482574" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.331554 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podStartSLOduration=4.841843887 podStartE2EDuration="34.331531782s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:46.787826453 +0000 UTC m=+1111.541894766" lastFinishedPulling="2026-03-19 19:15:16.277514358 +0000 UTC m=+1141.031582661" observedRunningTime="2026-03-19 19:15:19.218018589 +0000 UTC m=+1143.972086902" watchObservedRunningTime="2026-03-19 19:15:19.331531782 +0000 UTC m=+1144.085600095" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.335451 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podStartSLOduration=3.150005356 podStartE2EDuration="34.335441676s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.247246028 +0000 UTC m=+1112.001314341" lastFinishedPulling="2026-03-19 19:15:18.432682348 +0000 UTC m=+1143.186750661" observedRunningTime="2026-03-19 19:15:19.243760211 +0000 UTC m=+1143.997828524" watchObservedRunningTime="2026-03-19 19:15:19.335441676 +0000 UTC m=+1144.089509979" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.355059 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" podStartSLOduration=4.724355537 podStartE2EDuration="34.35503958s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.364378298 +0000 UTC m=+1113.118446611" lastFinishedPulling="2026-03-19 19:15:17.995062341 +0000 UTC m=+1142.749130654" observedRunningTime="2026-03-19 19:15:19.305029182 +0000 UTC m=+1144.059097495" watchObservedRunningTime="2026-03-19 19:15:19.35503958 +0000 UTC 
m=+1144.109107893" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.383170 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" podStartSLOduration=9.934942434 podStartE2EDuration="34.38314742s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.331264788 +0000 UTC m=+1113.085333101" lastFinishedPulling="2026-03-19 19:15:12.779469774 +0000 UTC m=+1137.533538087" observedRunningTime="2026-03-19 19:15:19.361075566 +0000 UTC m=+1144.115143879" watchObservedRunningTime="2026-03-19 19:15:19.38314742 +0000 UTC m=+1144.137215733" Mar 19 19:15:19 crc kubenswrapper[4826]: I0319 19:15:19.692204 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm"] Mar 19 19:15:20 crc kubenswrapper[4826]: I0319 19:15:20.043522 4826 generic.go:334] "Generic (PLEG): container finished" podID="9aa2d483-2b01-4910-91fc-28ad369bbb17" containerID="fc3e9fe6b07ef836e5f967c917f083ea8f6c8fd4f479a1732ee5ee4089692923" exitCode=0 Mar 19 19:15:20 crc kubenswrapper[4826]: I0319 19:15:20.043628 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" event={"ID":"9aa2d483-2b01-4910-91fc-28ad369bbb17","Type":"ContainerDied","Data":"fc3e9fe6b07ef836e5f967c917f083ea8f6c8fd4f479a1732ee5ee4089692923"} Mar 19 19:15:20 crc kubenswrapper[4826]: I0319 19:15:20.045157 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" event={"ID":"50980b03-91b0-4e4d-9923-e2a531458fd4","Type":"ContainerStarted","Data":"36b1f9aad82e66d08f2fabd266586fcee997e821760384b9140d3f25f645e508"} Mar 19 19:15:21 crc kubenswrapper[4826]: I0319 19:15:21.055632 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" event={"ID":"50980b03-91b0-4e4d-9923-e2a531458fd4","Type":"ContainerStarted","Data":"0c2f34ab591639d111c4934a0d9282a82981c96de35c94a53b74fbcfb47cae74"} Mar 19 19:15:21 crc kubenswrapper[4826]: I0319 19:15:21.090442 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" podStartSLOduration=35.090423763 podStartE2EDuration="35.090423763s" podCreationTimestamp="2026-03-19 19:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:15:21.084217403 +0000 UTC m=+1145.838285716" watchObservedRunningTime="2026-03-19 19:15:21.090423763 +0000 UTC m=+1145.844492076" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.001223 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.069032 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" event={"ID":"9aa2d483-2b01-4910-91fc-28ad369bbb17","Type":"ContainerDied","Data":"a46bf98ee63066773400f171cb1428fd6fec1b907693468570c8c37d38a29426"} Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.069077 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.069090 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46bf98ee63066773400f171cb1428fd6fec1b907693468570c8c37d38a29426" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.069323 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.122685 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume\") pod \"9aa2d483-2b01-4910-91fc-28ad369bbb17\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.122832 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5fdb\" (UniqueName: \"kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb\") pod \"9aa2d483-2b01-4910-91fc-28ad369bbb17\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.122880 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume\") pod \"9aa2d483-2b01-4910-91fc-28ad369bbb17\" (UID: \"9aa2d483-2b01-4910-91fc-28ad369bbb17\") " Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.124214 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume" (OuterVolumeSpecName: "config-volume") pod "9aa2d483-2b01-4910-91fc-28ad369bbb17" (UID: "9aa2d483-2b01-4910-91fc-28ad369bbb17"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.129152 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9aa2d483-2b01-4910-91fc-28ad369bbb17" (UID: "9aa2d483-2b01-4910-91fc-28ad369bbb17"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.129252 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb" (OuterVolumeSpecName: "kube-api-access-j5fdb") pod "9aa2d483-2b01-4910-91fc-28ad369bbb17" (UID: "9aa2d483-2b01-4910-91fc-28ad369bbb17"). InnerVolumeSpecName "kube-api-access-j5fdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.224909 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5fdb\" (UniqueName: \"kubernetes.io/projected/9aa2d483-2b01-4910-91fc-28ad369bbb17-kube-api-access-j5fdb\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.224936 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aa2d483-2b01-4910-91fc-28ad369bbb17-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:22 crc kubenswrapper[4826]: I0319 19:15:22.224945 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aa2d483-2b01-4910-91fc-28ad369bbb17-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.091401 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" 
event={"ID":"a960df53-d712-424a-85a7-64b0e50c911f","Type":"ContainerStarted","Data":"c85539ce9517f0b13ccc4eb01b1c3dbeae6d6c7e1282718adce636b8cd3ae051"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.091836 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.095410 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" event={"ID":"6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90","Type":"ContainerStarted","Data":"022ac60c65f54821ec408ecfa4b5cfccbe434d4dc34cd8ae28845a4daa3ceccd"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.095641 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.097345 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" event={"ID":"918ac815-fe60-44b9-b6c0-c99ee8dc80b8","Type":"ContainerStarted","Data":"72ab8a50302e29cc88abed996900e9d0b141a6a645cbea56cf1e66e7ad06f07c"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.097547 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.099009 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" event={"ID":"6243b523-966a-4f1d-b663-2f1ed4614fdb","Type":"ContainerStarted","Data":"b404056dacf1668e9a6ee2842e12e9fe4d3565f74446b93e187a9a86c94ff70e"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.099169 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.101126 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" event={"ID":"4f382869-5ee2-4a46-8188-d4ddd0bee2fa","Type":"ContainerStarted","Data":"b857d8cdc6d9f653d3c6edbb6a6cdeff43f9dad5bd645072af281d18fefa3ae7"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.101262 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.102673 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" event={"ID":"44055ef9-1bc5-4b25-a40d-553a1546fc15","Type":"ContainerStarted","Data":"c0aea16d54113eea374338a2a5ad761f722d63d272579619d2e5a1377a28915e"} Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.102852 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.107349 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podStartSLOduration=34.352577281 podStartE2EDuration="39.10733536s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:15:18.341219507 +0000 UTC m=+1143.095287820" lastFinishedPulling="2026-03-19 19:15:23.095977566 +0000 UTC m=+1147.850045899" observedRunningTime="2026-03-19 19:15:24.104782388 +0000 UTC m=+1148.858850721" watchObservedRunningTime="2026-03-19 19:15:24.10733536 +0000 UTC m=+1148.861403673" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.125584 4826 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podStartSLOduration=3.922379574 podStartE2EDuration="39.125555991s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.243587448 +0000 UTC m=+1112.997655761" lastFinishedPulling="2026-03-19 19:15:23.446763865 +0000 UTC m=+1148.200832178" observedRunningTime="2026-03-19 19:15:24.118224413 +0000 UTC m=+1148.872292736" watchObservedRunningTime="2026-03-19 19:15:24.125555991 +0000 UTC m=+1148.879624304" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.133422 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podStartSLOduration=4.366601851 podStartE2EDuration="39.13340522s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.330980261 +0000 UTC m=+1113.085048574" lastFinishedPulling="2026-03-19 19:15:23.09778362 +0000 UTC m=+1147.851851943" observedRunningTime="2026-03-19 19:15:24.131298349 +0000 UTC m=+1148.885366662" watchObservedRunningTime="2026-03-19 19:15:24.13340522 +0000 UTC m=+1148.887473533" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.147967 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podStartSLOduration=4.269573386 podStartE2EDuration="39.147949992s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.24325035 +0000 UTC m=+1112.997318663" lastFinishedPulling="2026-03-19 19:15:23.121626936 +0000 UTC m=+1147.875695269" observedRunningTime="2026-03-19 19:15:24.143437942 +0000 UTC m=+1148.897506285" watchObservedRunningTime="2026-03-19 19:15:24.147949992 +0000 UTC m=+1148.902018305" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.181445 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podStartSLOduration=4.602891623 podStartE2EDuration="39.181417431s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.343894163 +0000 UTC m=+1113.097962476" lastFinishedPulling="2026-03-19 19:15:22.922419921 +0000 UTC m=+1147.676488284" observedRunningTime="2026-03-19 19:15:24.164546932 +0000 UTC m=+1148.918615255" watchObservedRunningTime="2026-03-19 19:15:24.181417431 +0000 UTC m=+1148.935485744" Mar 19 19:15:24 crc kubenswrapper[4826]: I0319 19:15:24.206744 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podStartSLOduration=34.174929168 podStartE2EDuration="39.206723672s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:15:17.897690348 +0000 UTC m=+1142.651758671" lastFinishedPulling="2026-03-19 19:15:22.929484852 +0000 UTC m=+1147.683553175" observedRunningTime="2026-03-19 19:15:24.19340843 +0000 UTC m=+1148.947476743" watchObservedRunningTime="2026-03-19 19:15:24.206723672 +0000 UTC m=+1148.960791995" Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.110898 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" event={"ID":"d2375678-e630-4376-9dfd-28efbc77aed4","Type":"ContainerStarted","Data":"497d04c6c02cd33a4535ca06ac6841664f0d2fb4d066c68a30101f9f2463f892"} Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.111448 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.112339 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" 
event={"ID":"271f8c86-929d-46a4-8852-f5ec8e701bcb","Type":"ContainerStarted","Data":"7f6367b0f9bff3bc08127efc7af9c624813a6efd18a0521444513af3a6da5995"} Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.127570 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podStartSLOduration=2.9043644 podStartE2EDuration="40.127554768s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:47.195699402 +0000 UTC m=+1111.949767715" lastFinishedPulling="2026-03-19 19:15:24.41888977 +0000 UTC m=+1149.172958083" observedRunningTime="2026-03-19 19:15:25.12599043 +0000 UTC m=+1149.880058763" watchObservedRunningTime="2026-03-19 19:15:25.127554768 +0000 UTC m=+1149.881623081" Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.150309 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podStartSLOduration=3.958784475 podStartE2EDuration="40.150289087s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.172623714 +0000 UTC m=+1112.926692017" lastFinishedPulling="2026-03-19 19:15:24.364128316 +0000 UTC m=+1149.118196629" observedRunningTime="2026-03-19 19:15:25.146300001 +0000 UTC m=+1149.900368314" watchObservedRunningTime="2026-03-19 19:15:25.150289087 +0000 UTC m=+1149.904357400" Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.400535 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.400596 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:15:25 crc kubenswrapper[4826]: I0319 19:15:25.928184 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.002751 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.009150 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.034600 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.129841 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" event={"ID":"5d8869b3-7d43-4db2-b79d-f05c13d0d6f2","Type":"ContainerStarted","Data":"f6c4b15dc1fd2e3d53eb881ac1d35c2fddfbfb0c51cb4b365781a678c923c4f9"} Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.130097 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.157490 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podStartSLOduration=3.569531497 podStartE2EDuration="40.157472001s" 
podCreationTimestamp="2026-03-19 19:14:46 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.446032402 +0000 UTC m=+1113.200100715" lastFinishedPulling="2026-03-19 19:15:25.033972896 +0000 UTC m=+1149.788041219" observedRunningTime="2026-03-19 19:15:26.148839012 +0000 UTC m=+1150.902907345" watchObservedRunningTime="2026-03-19 19:15:26.157472001 +0000 UTC m=+1150.911540304" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.159406 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.308991 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.331945 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.599060 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.789676 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 19:15:26 crc kubenswrapper[4826]: I0319 19:15:26.958078 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 19:15:27 crc kubenswrapper[4826]: I0319 19:15:27.002773 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 19:15:27 crc kubenswrapper[4826]: I0319 19:15:27.024705 4826 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 19:15:28 crc kubenswrapper[4826]: I0319 19:15:28.144248 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" event={"ID":"ee5c97c9-5dc0-4292-9a34-08ca45f5387a","Type":"ContainerStarted","Data":"9aab00592e3103b510ac88ec4710424ef42bc2192e995f53db7890e497783d40"} Mar 19 19:15:28 crc kubenswrapper[4826]: I0319 19:15:28.145550 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:15:28 crc kubenswrapper[4826]: I0319 19:15:28.165290 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podStartSLOduration=4.03638533 podStartE2EDuration="43.165266738s" podCreationTimestamp="2026-03-19 19:14:45 +0000 UTC" firstStartedPulling="2026-03-19 19:14:48.408030883 +0000 UTC m=+1113.162099196" lastFinishedPulling="2026-03-19 19:15:27.536912251 +0000 UTC m=+1152.290980604" observedRunningTime="2026-03-19 19:15:28.163168756 +0000 UTC m=+1152.917237089" watchObservedRunningTime="2026-03-19 19:15:28.165266738 +0000 UTC m=+1152.919335041" Mar 19 19:15:28 crc kubenswrapper[4826]: I0319 19:15:28.745114 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 19:15:31 crc kubenswrapper[4826]: I0319 19:15:31.963329 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 19:15:32 crc kubenswrapper[4826]: I0319 19:15:32.301612 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 
19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.002522 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.044131 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.226573 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.334878 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.444317 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.509155 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.679832 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 19:15:36 crc kubenswrapper[4826]: I0319 19:15:36.980016 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.400118 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.400693 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.400745 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.401466 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.401529 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22" gracePeriod=600 Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.444034 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:15:55 crc kubenswrapper[4826]: E0319 19:15:55.444416 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa2d483-2b01-4910-91fc-28ad369bbb17" containerName="collect-profiles" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.444440 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9aa2d483-2b01-4910-91fc-28ad369bbb17" containerName="collect-profiles" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.444630 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa2d483-2b01-4910-91fc-28ad369bbb17" containerName="collect-profiles" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.445550 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.448129 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.448333 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h7h5q" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.449325 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.450138 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.462999 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.536965 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.538407 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.541331 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.544570 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.610443 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gfh5\" (UniqueName: \"kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.610718 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.713616 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.713717 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbg5m\" (UniqueName: \"kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.713764 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gfh5\" (UniqueName: \"kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.713805 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.713831 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.714568 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.738049 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gfh5\" (UniqueName: \"kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5\") pod \"dnsmasq-dns-675f4bcbfc-m87mz\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc 
kubenswrapper[4826]: I0319 19:15:55.768478 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.815724 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.815830 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.815899 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbg5m\" (UniqueName: \"kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.817519 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.818274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.835210 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbg5m\" (UniqueName: \"kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m\") pod \"dnsmasq-dns-78dd6ddcc-2z2f5\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:55 crc kubenswrapper[4826]: I0319 19:15:55.910592 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:15:56 crc kubenswrapper[4826]: W0319 19:15:56.325081 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod306202dd_aac9_4865_a5aa_69c04b06cf09.slice/crio-eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9 WatchSource:0}: Error finding container eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9: Status 404 returned error can't find the container with id eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9 Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.333824 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.392423 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:15:56 crc kubenswrapper[4826]: W0319 19:15:56.394835 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e26478_de34_4ca1_8c1f_ea760101ec64.slice/crio-a2c1ef2af69a688f61631fd0c5055ed3e1176b9f0d3c234bb7f64b20b8e5771b WatchSource:0}: Error finding container a2c1ef2af69a688f61631fd0c5055ed3e1176b9f0d3c234bb7f64b20b8e5771b: Status 404 returned error can't find the container with id 
a2c1ef2af69a688f61631fd0c5055ed3e1176b9f0d3c234bb7f64b20b8e5771b Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.413816 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" event={"ID":"68e26478-de34-4ca1-8c1f-ea760101ec64","Type":"ContainerStarted","Data":"a2c1ef2af69a688f61631fd0c5055ed3e1176b9f0d3c234bb7f64b20b8e5771b"} Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.415627 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" event={"ID":"306202dd-aac9-4865-a5aa-69c04b06cf09","Type":"ContainerStarted","Data":"eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9"} Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.418399 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22" exitCode=0 Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.418470 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22"} Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.418520 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68"} Mar 19 19:15:56 crc kubenswrapper[4826]: I0319 19:15:56.418542 4826 scope.go:117] "RemoveContainer" containerID="633ba93ffe9c9e9f20a094017e3572d6ef9546ba5f85c83960d8b20fb8ddd2bc" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.412847 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:15:58 crc 
kubenswrapper[4826]: I0319 19:15:58.439291 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.440980 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.443496 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.576310 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.576624 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqgk\" (UniqueName: \"kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.576721 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.639908 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.665922 4826 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.667425 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.676304 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.678288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqgk\" (UniqueName: \"kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.678329 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.678415 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.679726 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.680671 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.720916 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqgk\" (UniqueName: \"kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk\") pod \"dnsmasq-dns-5ccc8479f9-qtzpm\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.770697 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.780912 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szr2d\" (UniqueName: \"kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.786536 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.786706 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: 
\"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.896981 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.897113 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szr2d\" (UniqueName: \"kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.897266 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.897977 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc kubenswrapper[4826]: I0319 19:15:58.898744 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:58 crc 
kubenswrapper[4826]: I0319 19:15:58.919565 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szr2d\" (UniqueName: \"kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d\") pod \"dnsmasq-dns-57d769cc4f-css7p\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.021773 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.453330 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:15:59 crc kubenswrapper[4826]: W0319 19:15:59.506438 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb5b40f_d23c_411f_82d2_7d443b88a00b.slice/crio-c862cca108d182ddae56c93e9ddfb98b3996c6bff89adb16e3511c1f0ab34b9f WatchSource:0}: Error finding container c862cca108d182ddae56c93e9ddfb98b3996c6bff89adb16e3511c1f0ab34b9f: Status 404 returned error can't find the container with id c862cca108d182ddae56c93e9ddfb98b3996c6bff89adb16e3511c1f0ab34b9f Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.507401 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.509309 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.512623 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.512914 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.513009 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.513055 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x4lnc" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.513167 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.513174 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.513244 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.517860 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.615903 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616041 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6xxg5\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616119 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616162 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616206 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616229 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616258 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616279 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616348 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616390 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.616421 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.621898 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.717842 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.718775 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.718913 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.721902 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.725900 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 
19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726371 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726436 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726469 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726609 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726727 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726771 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726833 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.726866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxg5\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.727774 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.729265 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.731116 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.731812 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.732295 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.736335 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.736437 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a75865b4496e6c0b397e1e83bc349881282d7666fda34b40e214046b93469f8a/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.738559 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.743389 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.758370 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxg5\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.819120 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.820115 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.824286 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.838240 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.838242 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.838599 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.838814 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.838931 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.839148 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rgqtt" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.839508 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.841920 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.880329 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.885645 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.898530 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.921703 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.928224 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936815 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936901 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhpq\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") 
" pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936945 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936979 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.936996 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.937030 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.937234 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " 
pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.937285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.937307 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.939724 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:15:59 crc kubenswrapper[4826]: I0319 19:15:59.969372 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.015685 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040708 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040749 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040773 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040798 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040816 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040916 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040963 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.040985 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041008 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041040 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041095 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041119 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041167 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041191 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041259 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041308 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041371 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4c5s\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041416 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041434 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041458 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041496 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " 
pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041538 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041570 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vr4\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.041606 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.044776 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.044852 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: 
I0319 19:16:00.045625 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.047171 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.050153 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.050843 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.072002 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.072527 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.072555 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.073465 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.074073 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.074431 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.074598 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhpq\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq\") pod \"rabbitmq-server-0\" (UID: 
\"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.074721 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.087099 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.087147 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/474e894b78a3d9a61a889b2bf5e73544c970fa007f104547e7045d9e5c9c2882/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.087732 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.088040 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc 
kubenswrapper[4826]: I0319 19:16:00.089073 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.100251 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhpq\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq\") pod \"rabbitmq-server-0\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.130005 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565796-wqw9j"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.131534 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.134124 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.134342 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.134480 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.140142 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-wqw9j"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179015 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179113 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179169 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179445 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179687 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179724 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " 
pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179776 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179800 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvb6\" (UniqueName: \"kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6\") pod \"auto-csr-approver-29565796-wqw9j\" (UID: \"4d5fbfcc-053e-4453-961c-91a0719cdaa6\") " pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179854 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179886 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179933 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 
19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.179970 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180020 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180040 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180327 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180750 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180784 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180803 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180835 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180925 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.180937 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181044 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4c5s\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s\") pod \"rabbitmq-server-2\" (UID: 
\"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181074 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181119 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181142 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181200 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.181219 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vr4\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 
19:16:00.181278 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.182719 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.182931 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.183140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.183546 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.185272 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret\") 
pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.185850 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.186327 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.193407 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.195645 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.197336 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.203456 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.204855 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.204883 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c227a1c94246a3d17c388fbd1d144f5240f0189ff653b85b39092d420b9acf6/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.205484 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.205501 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/444a67217cc3019efbdec4573c58565e98449ba1a91ed8a6d8ae2930727b7eaf/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.209228 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.212610 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.218086 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4c5s\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.222223 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: 
\"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") " pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.226930 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vr4\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4\") pod \"rabbitmq-server-1\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.283668 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvb6\" (UniqueName: \"kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6\") pod \"auto-csr-approver-29565796-wqw9j\" (UID: \"4d5fbfcc-053e-4453-961c-91a0719cdaa6\") " pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.298211 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.300237 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvb6\" (UniqueName: \"kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6\") pod \"auto-csr-approver-29565796-wqw9j\" (UID: \"4d5fbfcc-053e-4453-961c-91a0719cdaa6\") " pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.300456 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" 
(UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.444338 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.454057 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.467632 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.519340 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerStarted","Data":"40d6e310243716b9d6f4a2f5859aab977e50d0f43d3b6857cbbeb1778e8122dc"} Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.521006 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" event={"ID":"2bb5b40f-d23c-411f-82d2-7d443b88a00b","Type":"ContainerStarted","Data":"c862cca108d182ddae56c93e9ddfb98b3996c6bff89adb16e3511c1f0ab34b9f"} Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.523499 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" event={"ID":"a9d44695-2372-4bc3-a893-1d82703fc963","Type":"ContainerStarted","Data":"25adced0ca0e574256d7e7846669aa2040f594750619b17e332ec32be7862ddc"} Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.571237 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.601110 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.958693 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.963016 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.966579 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bxfsz" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.967486 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.967913 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.968526 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 19:16:00 crc kubenswrapper[4826]: I0319 19:16:00.981306 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:00.987216 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112406 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5b78\" (UniqueName: \"kubernetes.io/projected/763c5ded-be94-49ad-9eea-447e444f24f3-kube-api-access-t5b78\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112456 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112510 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112537 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112555 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112592 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112624 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.112670 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.113425 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-wqw9j"] Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216105 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5b78\" (UniqueName: \"kubernetes.io/projected/763c5ded-be94-49ad-9eea-447e444f24f3-kube-api-access-t5b78\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216170 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216292 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216326 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216439 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.216500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.218329 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.218369 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.218754 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/763c5ded-be94-49ad-9eea-447e444f24f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.219674 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/763c5ded-be94-49ad-9eea-447e444f24f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.221967 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.221991 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8f6a9418f6aaf5406eb3359c177951916ad31cf8aae6c5af2630ff5a670dcdb2/globalmount\"" pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.226809 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.236077 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/763c5ded-be94-49ad-9eea-447e444f24f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.249001 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5b78\" (UniqueName: \"kubernetes.io/projected/763c5ded-be94-49ad-9eea-447e444f24f3-kube-api-access-t5b78\") pod \"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.331926 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ac08c2f3-41c8-4445-a2f6-499ac18d16ba\") pod 
\"openstack-galera-0\" (UID: \"763c5ded-be94-49ad-9eea-447e444f24f3\") " pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.556861 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" event={"ID":"4d5fbfcc-053e-4453-961c-91a0719cdaa6","Type":"ContainerStarted","Data":"626bd617295d2f711549f0cea9d5c4be79d4bd6bb66fa111a68886fca94a0d82"} Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.578745 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.587664 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.600883 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 19:16:01 crc kubenswrapper[4826]: I0319 19:16:01.632834 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.247968 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.261035 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.261137 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.290761 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.290961 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.291090 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.294776 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pgnzh" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356332 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356389 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38814433-1737-49df-966a-ac3511ed48dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356419 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 
19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356553 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnwn2\" (UniqueName: \"kubernetes.io/projected/38814433-1737-49df-966a-ac3511ed48dd-kube-api-access-cnwn2\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356689 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356755 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.356872 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.357105 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.459993 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnwn2\" (UniqueName: \"kubernetes.io/projected/38814433-1737-49df-966a-ac3511ed48dd-kube-api-access-cnwn2\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460269 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460303 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460361 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460392 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460410 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460438 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38814433-1737-49df-966a-ac3511ed48dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.460466 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.461732 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38814433-1737-49df-966a-ac3511ed48dd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.462569 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " 
pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.462879 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.463562 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38814433-1737-49df-966a-ac3511ed48dd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.466706 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.467267 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.467300 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29f96520455a9e86858c53f2694c14f4fe2163fd7088608e91d43703db8de386/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.468048 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38814433-1737-49df-966a-ac3511ed48dd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.474953 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnwn2\" (UniqueName: \"kubernetes.io/projected/38814433-1737-49df-966a-ac3511ed48dd-kube-api-access-cnwn2\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.538797 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e71037a-de4f-45b9-86d7-78acb8b9f894\") pod \"openstack-cell1-galera-0\" (UID: \"38814433-1737-49df-966a-ac3511ed48dd\") " pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.595038 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" 
event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerStarted","Data":"1069dee418723266d4b0d18b50cf3b7afbf08c0019ab1b520270ebdbbac3ca91"} Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.609706 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerStarted","Data":"12d0d4d94335881c332d8a29d3a7295c94f1cd8e18970716075bf0c2905733c6"} Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.617882 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerStarted","Data":"3bb8d5045124888b616e485e6dfd698e2ad1188dbddec76c6fd9a6b02e84f3ee"} Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.625968 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.652548 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.653780 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.656105 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9s6rs" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.656791 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.657144 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.673396 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.770126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jdnw\" (UniqueName: \"kubernetes.io/projected/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kube-api-access-4jdnw\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.770178 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.770433 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kolla-config\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.770485 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.770519 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-config-data\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.874793 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kolla-config\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.874880 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.874924 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-config-data\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.881603 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jdnw\" (UniqueName: \"kubernetes.io/projected/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kube-api-access-4jdnw\") pod \"memcached-0\" (UID: 
\"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.881693 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-config-data\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.881734 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.881881 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kolla-config\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.889273 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.896926 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/128ebe90-32c9-409b-b145-5f7f95c7dbbf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.901050 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jdnw\" (UniqueName: 
\"kubernetes.io/projected/128ebe90-32c9-409b-b145-5f7f95c7dbbf-kube-api-access-4jdnw\") pod \"memcached-0\" (UID: \"128ebe90-32c9-409b-b145-5f7f95c7dbbf\") " pod="openstack/memcached-0" Mar 19 19:16:02 crc kubenswrapper[4826]: I0319 19:16:02.987860 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.755119 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.757109 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.762363 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9b65p" Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.771865 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.857410 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pbr\" (UniqueName: \"kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr\") pod \"kube-state-metrics-0\" (UID: \"7be203b6-dbb5-49d5-935f-9844ee4d6c11\") " pod="openstack/kube-state-metrics-0" Mar 19 19:16:05 crc kubenswrapper[4826]: I0319 19:16:05.959563 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pbr\" (UniqueName: \"kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr\") pod \"kube-state-metrics-0\" (UID: \"7be203b6-dbb5-49d5-935f-9844ee4d6c11\") " pod="openstack/kube-state-metrics-0" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.009906 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pbr\" (UniqueName: 
\"kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr\") pod \"kube-state-metrics-0\" (UID: \"7be203b6-dbb5-49d5-935f-9844ee4d6c11\") " pod="openstack/kube-state-metrics-0" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.095934 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.501071 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq"] Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.502242 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.505268 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.505964 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-z2knl" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.516022 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq"] Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.570420 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svskl\" (UniqueName: \"kubernetes.io/projected/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-kube-api-access-svskl\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.570604 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.672570 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.672682 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svskl\" (UniqueName: \"kubernetes.io/projected/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-kube-api-access-svskl\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: E0319 19:16:06.673026 4826 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 19 19:16:06 crc kubenswrapper[4826]: E0319 19:16:06.673075 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert podName:e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5 nodeName:}" failed. No retries permitted until 2026-03-19 19:16:07.173060544 +0000 UTC m=+1191.927128857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert") pod "observability-ui-dashboards-7f87b9b85b-9r5qq" (UID: "e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5") : secret "observability-ui-dashboards" not found Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.695316 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svskl\" (UniqueName: \"kubernetes.io/projected/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-kube-api-access-svskl\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.855567 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-849c6d8fdf-t6vlp"] Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.856826 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.894669 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849c6d8fdf-t6vlp"] Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977717 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977769 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-oauth-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977797 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccbvc\" (UniqueName: \"kubernetes.io/projected/d068f929-58c2-481e-99bd-e7808a74f36e-kube-api-access-ccbvc\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977907 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-oauth-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977929 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-service-ca\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977945 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-console-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:06 crc kubenswrapper[4826]: I0319 19:16:06.977964 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-trusted-ca-bundle\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-oauth-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079413 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccbvc\" (UniqueName: \"kubernetes.io/projected/d068f929-58c2-481e-99bd-e7808a74f36e-kube-api-access-ccbvc\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: 
I0319 19:16:07.079606 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-oauth-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079631 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-service-ca\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079646 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-console-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079680 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-trusted-ca-bundle\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.079755 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.084240 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-console-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.084419 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-trusted-ca-bundle\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.084894 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-service-ca\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.085258 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d068f929-58c2-481e-99bd-e7808a74f36e-oauth-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.090307 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-oauth-config\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.098866 4826 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.102848 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d068f929-58c2-481e-99bd-e7808a74f36e-console-serving-cert\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.103171 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.110957 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.111199 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.111381 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.111600 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zc2nx" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.111606 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.111595 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.118024 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.121226 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccbvc\" (UniqueName: \"kubernetes.io/projected/d068f929-58c2-481e-99bd-e7808a74f36e-kube-api-access-ccbvc\") pod \"console-849c6d8fdf-t6vlp\" (UID: \"d068f929-58c2-481e-99bd-e7808a74f36e\") " pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.134774 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.171466 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186034 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186546 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186597 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186617 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186640 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlj5w\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186688 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186719 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186760 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186804 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186832 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186848 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.186866 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.190004 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-9r5qq\" (UID: \"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:07 crc 
kubenswrapper[4826]: I0319 19:16:07.289753 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.289841 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.289862 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.289885 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.289941 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.289986 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.290019 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.290057 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlj5w\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.290099 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.290141 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.290624 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.291217 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.292068 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.303209 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.324351 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.324394 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bec5dc65a51c5d8459204c761fcdfad10688abc652e4da5f0ede2b4f4a0c41c7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.326701 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.327397 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlj5w\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.329242 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.331216 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.332217 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.419036 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.430490 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.532976 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.722785 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wdll6"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.724437 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.729976 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wkxmm" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.730240 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.730483 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.759280 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.768123 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2vpmv"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.771420 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.782509 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2vpmv"] Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.809832 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.809897 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-combined-ca-bundle\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.809943 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-ovn-controller-tls-certs\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.809980 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d26gj\" (UniqueName: \"kubernetes.io/projected/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-kube-api-access-d26gj\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.810036 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.810061 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-scripts\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.810167 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-log-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912380 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-ovn-controller-tls-certs\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912433 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-etc-ovs\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912450 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcw44\" (UniqueName: 
\"kubernetes.io/projected/91261e82-2106-4923-9f54-8a7e04b033b8-kube-api-access-qcw44\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912474 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d26gj\" (UniqueName: \"kubernetes.io/projected/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-kube-api-access-d26gj\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912524 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912540 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-scripts\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912565 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91261e82-2106-4923-9f54-8a7e04b033b8-scripts\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912605 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-lib\") pod 
\"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912648 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-log-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912728 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-run\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912747 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912774 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-combined-ca-bundle\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.912791 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-log\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 
19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.916285 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.919903 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-run\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.919954 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-scripts\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.919990 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-var-log-ovn\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.920522 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-ovn-controller-tls-certs\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.922862 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-combined-ca-bundle\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:07 crc kubenswrapper[4826]: I0319 19:16:07.928037 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d26gj\" (UniqueName: \"kubernetes.io/projected/2ed5ed9d-f761-4b5d-8cc8-07693c1d1289-kube-api-access-d26gj\") pod \"ovn-controller-wdll6\" (UID: \"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289\") " pod="openstack/ovn-controller-wdll6" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014205 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91261e82-2106-4923-9f54-8a7e04b033b8-scripts\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014274 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-lib\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014357 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-run\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014385 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-log\") pod \"ovn-controller-ovs-2vpmv\" (UID: 
\"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014419 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-etc-ovs\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.014433 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcw44\" (UniqueName: \"kubernetes.io/projected/91261e82-2106-4923-9f54-8a7e04b033b8-kube-api-access-qcw44\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.015516 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-run\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.015553 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-log\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.015631 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-var-lib\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.015679 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/91261e82-2106-4923-9f54-8a7e04b033b8-etc-ovs\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.018229 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91261e82-2106-4923-9f54-8a7e04b033b8-scripts\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.030157 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcw44\" (UniqueName: \"kubernetes.io/projected/91261e82-2106-4923-9f54-8a7e04b033b8-kube-api-access-qcw44\") pod \"ovn-controller-ovs-2vpmv\" (UID: \"91261e82-2106-4923-9f54-8a7e04b033b8\") " pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.054942 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdll6" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.090194 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.609420 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.611032 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.618400 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-h7dsw" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.618482 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.619005 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.619044 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.619275 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.626762 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727237 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727325 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727352 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727383 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727433 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727478 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727516 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qt7b\" (UniqueName: \"kubernetes.io/projected/7411e60b-ca3b-4409-a289-0513649c49b3-kube-api-access-5qt7b\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.727545 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829210 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829282 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qt7b\" (UniqueName: \"kubernetes.io/projected/7411e60b-ca3b-4409-a289-0513649c49b3-kube-api-access-5qt7b\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829323 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829425 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829447 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829479 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.829538 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.830149 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-config\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.830319 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.830701 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7411e60b-ca3b-4409-a289-0513649c49b3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.831616 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.831679 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/057b3e170466b8586b77e47368800ac855e84c2640e54c4fe311951efb43caf7/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.833445 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.834560 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.839003 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7411e60b-ca3b-4409-a289-0513649c49b3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.851429 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qt7b\" (UniqueName: \"kubernetes.io/projected/7411e60b-ca3b-4409-a289-0513649c49b3-kube-api-access-5qt7b\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.869343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de83235f-a8f9-4a27-aa8b-d16d83677c73\") pod \"ovsdbserver-nb-0\" (UID: \"7411e60b-ca3b-4409-a289-0513649c49b3\") " pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:08 crc kubenswrapper[4826]: I0319 19:16:08.938506 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.909290 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.911400 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.913709 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.913774 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.913946 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c7hlr" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.915729 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.925834 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990261 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990330 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990407 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fh9k\" (UniqueName: \"kubernetes.io/projected/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-kube-api-access-4fh9k\") pod 
\"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990442 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990461 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990505 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-config\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990552 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:11 crc kubenswrapper[4826]: I0319 19:16:11.990580 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cf39844c-952e-4196-9588-ab43d44b3500\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf39844c-952e-4196-9588-ab43d44b3500\") pod \"ovsdbserver-sb-0\" (UID: 
\"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091776 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091836 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091880 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fh9k\" (UniqueName: \"kubernetes.io/projected/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-kube-api-access-4fh9k\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091918 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091935 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.091989 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-config\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.092127 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.092160 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cf39844c-952e-4196-9588-ab43d44b3500\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf39844c-952e-4196-9588-ab43d44b3500\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.092489 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.092953 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-config\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.093085 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-scripts\") 
pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.097197 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.101938 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.109186 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.109395 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.109430 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cf39844c-952e-4196-9588-ab43d44b3500\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf39844c-952e-4196-9588-ab43d44b3500\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e3995ddc0c95148bf9621d9159511d6b3846a5504d6002df069ec03edf90ddc2/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.125443 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fh9k\" (UniqueName: \"kubernetes.io/projected/a1bd0e4d-d264-47a7-a3d0-71d4824ca253-kube-api-access-4fh9k\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.166542 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cf39844c-952e-4196-9588-ab43d44b3500\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cf39844c-952e-4196-9588-ab43d44b3500\") pod \"ovsdbserver-sb-0\" (UID: \"a1bd0e4d-d264-47a7-a3d0-71d4824ca253\") " pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:12 crc kubenswrapper[4826]: I0319 19:16:12.251813 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:15 crc kubenswrapper[4826]: I0319 19:16:15.770387 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 19:16:19 crc kubenswrapper[4826]: E0319 19:16:19.687399 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 19 19:16:19 crc kubenswrapper[4826]: E0319 19:16:19.688175 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6xxg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(69dc8d23-ac18-40b1-99d9-365705c5753b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:16:19 crc 
kubenswrapper[4826]: E0319 19:16:19.689709 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" Mar 19 19:16:19 crc kubenswrapper[4826]: E0319 19:16:19.832145 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" Mar 19 19:16:24 crc kubenswrapper[4826]: I0319 19:16:24.912338 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerStarted","Data":"620b7496fe08941eca29335314d653254347f3be240466f5342cf5d368b33c28"} Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.596735 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.597269 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnqgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-qtzpm_openstack(2bb5b40f-d23c-411f-82d2-7d443b88a00b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.598626 4826 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.668274 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.668431 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gfh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-m87mz_openstack(306202dd-aac9-4865-a5aa-69c04b06cf09): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.670773 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" podUID="306202dd-aac9-4865-a5aa-69c04b06cf09" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.677810 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.677964 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbg5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-2z2f5_openstack(68e26478-de34-4ca1-8c1f-ea760101ec64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.681310 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" podUID="68e26478-de34-4ca1-8c1f-ea760101ec64" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.872483 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.872999 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szr2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-css7p_openstack(a9d44695-2372-4bc3-a893-1d82703fc963): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.875393 4826 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.926146 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" Mar 19 19:16:25 crc kubenswrapper[4826]: E0319 19:16:25.926413 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.786623 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.793030 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 19:16:26 crc kubenswrapper[4826]: W0319 19:16:26.876480 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63839c94_d94a_4fe8_a195_b86a6a9e8b79.slice/crio-d4323ce2073b9fe115b288f1a909240a1be6b110f1a4e83906c64c507040c4f2 WatchSource:0}: Error finding container d4323ce2073b9fe115b288f1a909240a1be6b110f1a4e83906c64c507040c4f2: Status 404 returned error can't find the container with id d4323ce2073b9fe115b288f1a909240a1be6b110f1a4e83906c64c507040c4f2 Mar 19 19:16:26 crc kubenswrapper[4826]: W0319 19:16:26.877457 4826 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod128ebe90_32c9_409b_b145_5f7f95c7dbbf.slice/crio-63a2905a093916437da8f94d6511949ee79c1fc16406de84b9a207e8cccef612 WatchSource:0}: Error finding container 63a2905a093916437da8f94d6511949ee79c1fc16406de84b9a207e8cccef612: Status 404 returned error can't find the container with id 63a2905a093916437da8f94d6511949ee79c1fc16406de84b9a207e8cccef612 Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.935349 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" event={"ID":"306202dd-aac9-4865-a5aa-69c04b06cf09","Type":"ContainerDied","Data":"eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9"} Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.935391 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eef8018a26014fc78ac5bc6a25c7450cfd24a02a8c11eb682709531589451ba9" Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.937610 4826 generic.go:334] "Generic (PLEG): container finished" podID="4d5fbfcc-053e-4453-961c-91a0719cdaa6" containerID="d401a99a442648f9b3fadc5fa90a70eef741fe9341c23b08bf40a32aca5d3fe1" exitCode=0 Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.938265 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" event={"ID":"4d5fbfcc-053e-4453-961c-91a0719cdaa6","Type":"ContainerDied","Data":"d401a99a442648f9b3fadc5fa90a70eef741fe9341c23b08bf40a32aca5d3fe1"} Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.942445 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"128ebe90-32c9-409b-b145-5f7f95c7dbbf","Type":"ContainerStarted","Data":"63a2905a093916437da8f94d6511949ee79c1fc16406de84b9a207e8cccef612"} Mar 19 19:16:26 crc kubenswrapper[4826]: I0319 19:16:26.944612 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerStarted","Data":"d4323ce2073b9fe115b288f1a909240a1be6b110f1a4e83906c64c507040c4f2"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.047074 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.081217 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.122143 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gfh5\" (UniqueName: \"kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5\") pod \"306202dd-aac9-4865-a5aa-69c04b06cf09\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.122204 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc\") pod \"68e26478-de34-4ca1-8c1f-ea760101ec64\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.122344 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config\") pod \"68e26478-de34-4ca1-8c1f-ea760101ec64\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.122460 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config\") pod \"306202dd-aac9-4865-a5aa-69c04b06cf09\" (UID: \"306202dd-aac9-4865-a5aa-69c04b06cf09\") " Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.122544 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbg5m\" (UniqueName: \"kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m\") pod \"68e26478-de34-4ca1-8c1f-ea760101ec64\" (UID: \"68e26478-de34-4ca1-8c1f-ea760101ec64\") " Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.124140 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config" (OuterVolumeSpecName: "config") pod "306202dd-aac9-4865-a5aa-69c04b06cf09" (UID: "306202dd-aac9-4865-a5aa-69c04b06cf09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.124533 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config" (OuterVolumeSpecName: "config") pod "68e26478-de34-4ca1-8c1f-ea760101ec64" (UID: "68e26478-de34-4ca1-8c1f-ea760101ec64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.124880 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68e26478-de34-4ca1-8c1f-ea760101ec64" (UID: "68e26478-de34-4ca1-8c1f-ea760101ec64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.129365 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m" (OuterVolumeSpecName: "kube-api-access-rbg5m") pod "68e26478-de34-4ca1-8c1f-ea760101ec64" (UID: "68e26478-de34-4ca1-8c1f-ea760101ec64"). InnerVolumeSpecName "kube-api-access-rbg5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.167271 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5" (OuterVolumeSpecName: "kube-api-access-9gfh5") pod "306202dd-aac9-4865-a5aa-69c04b06cf09" (UID: "306202dd-aac9-4865-a5aa-69c04b06cf09"). InnerVolumeSpecName "kube-api-access-9gfh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.203908 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.225265 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306202dd-aac9-4865-a5aa-69c04b06cf09-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.225292 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbg5m\" (UniqueName: \"kubernetes.io/projected/68e26478-de34-4ca1-8c1f-ea760101ec64-kube-api-access-rbg5m\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.225302 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gfh5\" (UniqueName: \"kubernetes.io/projected/306202dd-aac9-4865-a5aa-69c04b06cf09-kube-api-access-9gfh5\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.225312 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.225321 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e26478-de34-4ca1-8c1f-ea760101ec64-config\") on node \"crc\" DevicePath 
\"\"" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.383010 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-849c6d8fdf-t6vlp"] Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.417076 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq"] Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.426321 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.432232 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6"] Mar 19 19:16:27 crc kubenswrapper[4826]: W0319 19:16:27.482094 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd068f929_58c2_481e_99bd_e7808a74f36e.slice/crio-e57aba7237813bc874e5d8e5037af86fc70cc32555e93c313296ac9712ed6b12 WatchSource:0}: Error finding container e57aba7237813bc874e5d8e5037af86fc70cc32555e93c313296ac9712ed6b12: Status 404 returned error can't find the container with id e57aba7237813bc874e5d8e5037af86fc70cc32555e93c313296ac9712ed6b12 Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.753549 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.957802 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6" event={"ID":"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289","Type":"ContainerStarted","Data":"d24d701f2074bafdac9df2c5d5eafbbea19dc471eaf0d7a97bb0707833afe0d5"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.965215 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7be203b6-dbb5-49d5-935f-9844ee4d6c11","Type":"ContainerStarted","Data":"5bb48a28dbd5ccc899c69c66e3bc9b2c57c5fe24c6d4bb5c9c72272b6309228a"} Mar 19 19:16:27 
crc kubenswrapper[4826]: I0319 19:16:27.966913 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" event={"ID":"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5","Type":"ContainerStarted","Data":"ba7000fd7dd4a37d0ce7320648da1fb665cf6f1cc0710516e41cc7ee1cab5d76"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.968764 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849c6d8fdf-t6vlp" event={"ID":"d068f929-58c2-481e-99bd-e7808a74f36e","Type":"ContainerStarted","Data":"01cf0b5e16a2a0f2b6b150c7f92fb442c9a5a540da1a3a471cf772321432716b"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.968790 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-849c6d8fdf-t6vlp" event={"ID":"d068f929-58c2-481e-99bd-e7808a74f36e","Type":"ContainerStarted","Data":"e57aba7237813bc874e5d8e5037af86fc70cc32555e93c313296ac9712ed6b12"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.971708 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerStarted","Data":"045565dbdc1fcc69a6554c263960690e114832329c56dc8084a5c107a59ae84b"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.973409 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerStarted","Data":"4949977f15e132445d8d9e1657957d62bc88426ae01376ce6c0dd6719b94f8e3"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.974213 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a1bd0e4d-d264-47a7-a3d0-71d4824ca253","Type":"ContainerStarted","Data":"483cce7981317ecc345c73820ab33fa547caa95c71ab81731a4672e11df8e4a3"} Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.977832 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" Mar 19 19:16:27 crc kubenswrapper[4826]: I0319 19:16:27.980824 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m87mz" Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.003713 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerStarted","Data":"e5e4e0430189f36a8c7884848b5ce7fde6a5e084d033daec1b289ae4626bb168"} Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.003751 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2z2f5" event={"ID":"68e26478-de34-4ca1-8c1f-ea760101ec64","Type":"ContainerDied","Data":"a2c1ef2af69a688f61631fd0c5055ed3e1176b9f0d3c234bb7f64b20b8e5771b"} Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.003769 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerStarted","Data":"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd"} Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.028191 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-849c6d8fdf-t6vlp" podStartSLOduration=22.028171622 podStartE2EDuration="22.028171622s" podCreationTimestamp="2026-03-19 19:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:27.989991948 +0000 UTC m=+1212.744060261" watchObservedRunningTime="2026-03-19 19:16:28.028171622 +0000 UTC m=+1212.782239935" Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.142517 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.154824 4826 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m87mz"] Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.177408 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.183575 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2z2f5"] Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.460934 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2vpmv"] Mar 19 19:16:28 crc kubenswrapper[4826]: I0319 19:16:28.592010 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.517824 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.692208 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvb6\" (UniqueName: \"kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6\") pod \"4d5fbfcc-053e-4453-961c-91a0719cdaa6\" (UID: \"4d5fbfcc-053e-4453-961c-91a0719cdaa6\") " Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.714562 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6" (OuterVolumeSpecName: "kube-api-access-rzvb6") pod "4d5fbfcc-053e-4453-961c-91a0719cdaa6" (UID: "4d5fbfcc-053e-4453-961c-91a0719cdaa6"). InnerVolumeSpecName "kube-api-access-rzvb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.794103 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvb6\" (UniqueName: \"kubernetes.io/projected/4d5fbfcc-053e-4453-961c-91a0719cdaa6-kube-api-access-rzvb6\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.989888 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306202dd-aac9-4865-a5aa-69c04b06cf09" path="/var/lib/kubelet/pods/306202dd-aac9-4865-a5aa-69c04b06cf09/volumes" Mar 19 19:16:29 crc kubenswrapper[4826]: I0319 19:16:29.990441 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e26478-de34-4ca1-8c1f-ea760101ec64" path="/var/lib/kubelet/pods/68e26478-de34-4ca1-8c1f-ea760101ec64/volumes" Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.000166 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7411e60b-ca3b-4409-a289-0513649c49b3","Type":"ContainerStarted","Data":"f8e28de9c81beef7a400317acc7be53ca510202cd09c1c51a89a1f34f384dd15"} Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.001750 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" event={"ID":"4d5fbfcc-053e-4453-961c-91a0719cdaa6","Type":"ContainerDied","Data":"626bd617295d2f711549f0cea9d5c4be79d4bd6bb66fa111a68886fca94a0d82"} Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.001793 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626bd617295d2f711549f0cea9d5c4be79d4bd6bb66fa111a68886fca94a0d82" Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.001837 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565796-wqw9j" Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.615916 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-twk5t"] Mar 19 19:16:30 crc kubenswrapper[4826]: I0319 19:16:30.622393 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565790-twk5t"] Mar 19 19:16:31 crc kubenswrapper[4826]: W0319 19:16:31.762874 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91261e82_2106_4923_9f54_8a7e04b033b8.slice/crio-801264316cafd364ae7fbf02bedffaae2b1c03b91c62d9aa1b2993a444d029f4 WatchSource:0}: Error finding container 801264316cafd364ae7fbf02bedffaae2b1c03b91c62d9aa1b2993a444d029f4: Status 404 returned error can't find the container with id 801264316cafd364ae7fbf02bedffaae2b1c03b91c62d9aa1b2993a444d029f4 Mar 19 19:16:32 crc kubenswrapper[4826]: I0319 19:16:32.014543 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac6f48b-794a-4278-8d3f-9a1cfd37c033" path="/var/lib/kubelet/pods/6ac6f48b-794a-4278-8d3f-9a1cfd37c033/volumes" Mar 19 19:16:32 crc kubenswrapper[4826]: I0319 19:16:32.027069 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2vpmv" event={"ID":"91261e82-2106-4923-9f54-8a7e04b033b8","Type":"ContainerStarted","Data":"801264316cafd364ae7fbf02bedffaae2b1c03b91c62d9aa1b2993a444d029f4"} Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.097820 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"128ebe90-32c9-409b-b145-5f7f95c7dbbf","Type":"ContainerStarted","Data":"4db0b4ce213df9de558e74a122dca877cbee835a2e2dc10ce8f3d04c7b703efa"} Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.100572 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 
19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.106885 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" event={"ID":"e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5","Type":"ContainerStarted","Data":"380b12db30ea57c687596e9a71f31ab4ddbfa526d56377ab1ab3bb364cd7c077"} Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.124551 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=27.45681482 podStartE2EDuration="35.124528293s" podCreationTimestamp="2026-03-19 19:16:02 +0000 UTC" firstStartedPulling="2026-03-19 19:16:26.88079088 +0000 UTC m=+1211.634859183" lastFinishedPulling="2026-03-19 19:16:34.548504343 +0000 UTC m=+1219.302572656" observedRunningTime="2026-03-19 19:16:37.121236983 +0000 UTC m=+1221.875305306" watchObservedRunningTime="2026-03-19 19:16:37.124528293 +0000 UTC m=+1221.878596606" Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.154077 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-9r5qq" podStartSLOduration=22.811007752 podStartE2EDuration="31.154056387s" podCreationTimestamp="2026-03-19 19:16:06 +0000 UTC" firstStartedPulling="2026-03-19 19:16:27.52500675 +0000 UTC m=+1212.279075063" lastFinishedPulling="2026-03-19 19:16:35.868055375 +0000 UTC m=+1220.622123698" observedRunningTime="2026-03-19 19:16:37.138717176 +0000 UTC m=+1221.892785489" watchObservedRunningTime="2026-03-19 19:16:37.154056387 +0000 UTC m=+1221.908124710" Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.188010 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:37 crc kubenswrapper[4826]: I0319 19:16:37.188048 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:37 crc 
kubenswrapper[4826]: I0319 19:16:37.194213 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.117357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerStarted","Data":"d503c343673c800d54ff6e6cc56a18acaec57bf393c9bb2a22d379eea6512b2d"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.119504 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6" event={"ID":"2ed5ed9d-f761-4b5d-8cc8-07693c1d1289","Type":"ContainerStarted","Data":"24bb2ced8ade1a1732fef2597f049bb9fa4246f4ab9a819004c6eeb42e8ee1dd"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.119642 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wdll6" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.121492 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7be203b6-dbb5-49d5-935f-9844ee4d6c11","Type":"ContainerStarted","Data":"5bb004602f7948e5a1d29c36fe0641da926560d38c110106c8ccbaf1a5f48fa0"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.121573 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.123867 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a1bd0e4d-d264-47a7-a3d0-71d4824ca253","Type":"ContainerStarted","Data":"41737093439ac4d583eadf04fbe14855334a1fff26f9b7cec93f8bb5e0153182"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.125586 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerStarted","Data":"c675efc30ff2824888327fc0d074397c016dd9f7f215ff3213708f95e459f518"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.127533 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2vpmv" event={"ID":"91261e82-2106-4923-9f54-8a7e04b033b8","Type":"ContainerStarted","Data":"5f7a0dfac128b48239411318d175f9dd30bf86d3dfcf407fc002e0a65bf4875e"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.128911 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerStarted","Data":"7ddcf38015080e1279c095f6f6c3b6974053d4a4a54d3cdeeaddc0b4bd67211e"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.130240 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7411e60b-ca3b-4409-a289-0513649c49b3","Type":"ContainerStarted","Data":"d83067b8f2e20adff542aee21600dbd43b8ee88d9bfdd3d5a5b8289379893d75"} Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.134271 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.205882 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wdll6" podStartSLOduration=22.849831299999998 podStartE2EDuration="31.205865648s" podCreationTimestamp="2026-03-19 19:16:07 +0000 UTC" firstStartedPulling="2026-03-19 19:16:27.512019087 +0000 UTC m=+1212.266087410" lastFinishedPulling="2026-03-19 19:16:35.868053435 +0000 UTC m=+1220.622121758" observedRunningTime="2026-03-19 19:16:38.199907444 +0000 UTC m=+1222.953975777" watchObservedRunningTime="2026-03-19 19:16:38.205865648 +0000 UTC m=+1222.959933951" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.302178 4826 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.317631047 podStartE2EDuration="33.302159896s" podCreationTimestamp="2026-03-19 19:16:05 +0000 UTC" firstStartedPulling="2026-03-19 19:16:27.499049073 +0000 UTC m=+1212.253117386" lastFinishedPulling="2026-03-19 19:16:36.483577912 +0000 UTC m=+1221.237646235" observedRunningTime="2026-03-19 19:16:38.28086002 +0000 UTC m=+1223.034928333" watchObservedRunningTime="2026-03-19 19:16:38.302159896 +0000 UTC m=+1223.056228209" Mar 19 19:16:38 crc kubenswrapper[4826]: I0319 19:16:38.308632 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:16:39 crc kubenswrapper[4826]: I0319 19:16:39.143506 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" event={"ID":"2bb5b40f-d23c-411f-82d2-7d443b88a00b","Type":"ContainerDied","Data":"356c2ada1626789f06f3e9dfe2d3d9cadf24fc37bd6cae4a220c9542763a18bf"} Mar 19 19:16:39 crc kubenswrapper[4826]: I0319 19:16:39.143415 4826 generic.go:334] "Generic (PLEG): container finished" podID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerID="356c2ada1626789f06f3e9dfe2d3d9cadf24fc37bd6cae4a220c9542763a18bf" exitCode=0 Mar 19 19:16:39 crc kubenswrapper[4826]: I0319 19:16:39.148484 4826 generic.go:334] "Generic (PLEG): container finished" podID="91261e82-2106-4923-9f54-8a7e04b033b8" containerID="5f7a0dfac128b48239411318d175f9dd30bf86d3dfcf407fc002e0a65bf4875e" exitCode=0 Mar 19 19:16:39 crc kubenswrapper[4826]: I0319 19:16:39.148555 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2vpmv" event={"ID":"91261e82-2106-4923-9f54-8a7e04b033b8","Type":"ContainerDied","Data":"5f7a0dfac128b48239411318d175f9dd30bf86d3dfcf407fc002e0a65bf4875e"} Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.166122 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2vpmv" 
event={"ID":"91261e82-2106-4923-9f54-8a7e04b033b8","Type":"ContainerStarted","Data":"049b1bdb4752140a99ec4740acd83269e1b59122fad638c6c53810b1e5e4829f"} Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.169781 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.169826 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.169841 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2vpmv" event={"ID":"91261e82-2106-4923-9f54-8a7e04b033b8","Type":"ContainerStarted","Data":"f85e4187f01a81dc9895956824e05e6e34a400b1c7496bb5e50cd04e6fbe6811"} Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.170288 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" event={"ID":"2bb5b40f-d23c-411f-82d2-7d443b88a00b","Type":"ContainerStarted","Data":"a5b9f30c58e83a0c0764abbc60312b001c058fdf98943cc86e22b46a562c7676"} Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.170476 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.173156 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerStarted","Data":"7e3557f404ae5596f07e68f863e66e56de142e27881c119d8b748b24b7a4453b"} Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.196938 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2vpmv" podStartSLOduration=29.414025482 podStartE2EDuration="33.19691749s" podCreationTimestamp="2026-03-19 19:16:07 +0000 UTC" firstStartedPulling="2026-03-19 19:16:31.766375352 +0000 UTC m=+1216.520443675" 
lastFinishedPulling="2026-03-19 19:16:35.54926737 +0000 UTC m=+1220.303335683" observedRunningTime="2026-03-19 19:16:40.196575262 +0000 UTC m=+1224.950643605" watchObservedRunningTime="2026-03-19 19:16:40.19691749 +0000 UTC m=+1224.950985803" Mar 19 19:16:40 crc kubenswrapper[4826]: I0319 19:16:40.224196 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" podStartSLOduration=3.124638601 podStartE2EDuration="42.224178299s" podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:15:59.513816809 +0000 UTC m=+1184.267885112" lastFinishedPulling="2026-03-19 19:16:38.613356497 +0000 UTC m=+1223.367424810" observedRunningTime="2026-03-19 19:16:40.216194227 +0000 UTC m=+1224.970262570" watchObservedRunningTime="2026-03-19 19:16:40.224178299 +0000 UTC m=+1224.978246622" Mar 19 19:16:41 crc kubenswrapper[4826]: E0319 19:16:41.690836 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38814433_1737_49df_966a_ac3511ed48dd.slice/crio-c675efc30ff2824888327fc0d074397c016dd9f7f215ff3213708f95e459f518.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod763c5ded_be94_49ad_9eea_447e444f24f3.slice/crio-7ddcf38015080e1279c095f6f6c3b6974053d4a4a54d3cdeeaddc0b4bd67211e.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:16:42 crc kubenswrapper[4826]: I0319 19:16:42.220867 4826 generic.go:334] "Generic (PLEG): container finished" podID="38814433-1737-49df-966a-ac3511ed48dd" containerID="c675efc30ff2824888327fc0d074397c016dd9f7f215ff3213708f95e459f518" exitCode=0 Mar 19 19:16:42 crc kubenswrapper[4826]: I0319 19:16:42.220966 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerDied","Data":"c675efc30ff2824888327fc0d074397c016dd9f7f215ff3213708f95e459f518"} Mar 19 19:16:42 crc kubenswrapper[4826]: I0319 19:16:42.225519 4826 generic.go:334] "Generic (PLEG): container finished" podID="763c5ded-be94-49ad-9eea-447e444f24f3" containerID="7ddcf38015080e1279c095f6f6c3b6974053d4a4a54d3cdeeaddc0b4bd67211e" exitCode=0 Mar 19 19:16:42 crc kubenswrapper[4826]: I0319 19:16:42.225574 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerDied","Data":"7ddcf38015080e1279c095f6f6c3b6974053d4a4a54d3cdeeaddc0b4bd67211e"} Mar 19 19:16:42 crc kubenswrapper[4826]: I0319 19:16:42.991010 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.236187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerStarted","Data":"48ef0c18927acad3cab5327c9df4d256a3f01325b10cf9e6772558514a35dec9"} Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.237433 4826 generic.go:334] "Generic (PLEG): container finished" podID="a9d44695-2372-4bc3-a893-1d82703fc963" containerID="379adec6e5c7baf26d53552d8e77b0964e8f8693cfa0e9b5245dd4e7c4a91013" exitCode=0 Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.237521 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" event={"ID":"a9d44695-2372-4bc3-a893-1d82703fc963","Type":"ContainerDied","Data":"379adec6e5c7baf26d53552d8e77b0964e8f8693cfa0e9b5245dd4e7c4a91013"} Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.240670 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"7411e60b-ca3b-4409-a289-0513649c49b3","Type":"ContainerStarted","Data":"0b050a90cb60c7f543735c4acf7910a3fbd52cd77c02a89a99f58acf31813d77"} Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.242590 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a1bd0e4d-d264-47a7-a3d0-71d4824ca253","Type":"ContainerStarted","Data":"0f950c6d7d12367f8c5f751758cfa83f4f57fe94ac41da559080cd6c39488f7b"} Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.244166 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerStarted","Data":"3c6b4dafb4bb937c4481ee36080942d492dddee83e2f324b34dcb098d03b3ea9"} Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.258358 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=33.925908526 podStartE2EDuration="44.258344693s" podCreationTimestamp="2026-03-19 19:15:59 +0000 UTC" firstStartedPulling="2026-03-19 19:16:24.675412098 +0000 UTC m=+1209.429480421" lastFinishedPulling="2026-03-19 19:16:35.007848275 +0000 UTC m=+1219.761916588" observedRunningTime="2026-03-19 19:16:43.254140801 +0000 UTC m=+1228.008209114" watchObservedRunningTime="2026-03-19 19:16:43.258344693 +0000 UTC m=+1228.012413006" Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.285181 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.905015213 podStartE2EDuration="33.28514085s" podCreationTimestamp="2026-03-19 19:16:10 +0000 UTC" firstStartedPulling="2026-03-19 19:16:27.766797804 +0000 UTC m=+1212.520866117" lastFinishedPulling="2026-03-19 19:16:42.146923431 +0000 UTC m=+1226.900991754" observedRunningTime="2026-03-19 19:16:43.281286058 +0000 UTC m=+1228.035354371" watchObservedRunningTime="2026-03-19 19:16:43.28514085 +0000 UTC m=+1228.039209173" 
Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.307994 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=34.123340238 podStartE2EDuration="42.307978883s" podCreationTimestamp="2026-03-19 19:16:01 +0000 UTC" firstStartedPulling="2026-03-19 19:16:27.209028474 +0000 UTC m=+1211.963096787" lastFinishedPulling="2026-03-19 19:16:35.393667109 +0000 UTC m=+1220.147735432" observedRunningTime="2026-03-19 19:16:43.303687159 +0000 UTC m=+1228.057755482" watchObservedRunningTime="2026-03-19 19:16:43.307978883 +0000 UTC m=+1228.062047196" Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.348604 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=23.600992715 podStartE2EDuration="36.348587934s" podCreationTimestamp="2026-03-19 19:16:07 +0000 UTC" firstStartedPulling="2026-03-19 19:16:29.425138035 +0000 UTC m=+1214.179206368" lastFinishedPulling="2026-03-19 19:16:42.172733264 +0000 UTC m=+1226.926801587" observedRunningTime="2026-03-19 19:16:43.345285194 +0000 UTC m=+1228.099353507" watchObservedRunningTime="2026-03-19 19:16:43.348587934 +0000 UTC m=+1228.102656247" Mar 19 19:16:43 crc kubenswrapper[4826]: I0319 19:16:43.938833 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:44 crc kubenswrapper[4826]: I0319 19:16:44.254636 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" event={"ID":"a9d44695-2372-4bc3-a893-1d82703fc963","Type":"ContainerStarted","Data":"4d4147fc0aeca4b0eb1a50b7f94a4ebb41bd11522847a3c43bc4b48f4de1f16c"} Mar 19 19:16:44 crc kubenswrapper[4826]: I0319 19:16:44.280052 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" podStartSLOduration=-9223371990.574741 podStartE2EDuration="46.280034866s" 
podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:15:59.648489705 +0000 UTC m=+1184.402558018" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:44.272996676 +0000 UTC m=+1229.027064989" watchObservedRunningTime="2026-03-19 19:16:44.280034866 +0000 UTC m=+1229.034103179" Mar 19 19:16:44 crc kubenswrapper[4826]: I0319 19:16:44.939055 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:44 crc kubenswrapper[4826]: I0319 19:16:44.997848 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.253677 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.263747 4826 generic.go:334] "Generic (PLEG): container finished" podID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerID="7e3557f404ae5596f07e68f863e66e56de142e27881c119d8b748b24b7a4453b" exitCode=0 Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.263848 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerDied","Data":"7e3557f404ae5596f07e68f863e66e56de142e27881c119d8b748b24b7a4453b"} Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.323424 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.324171 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.506731 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.506939 
4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="dnsmasq-dns" containerID="cri-o://4d4147fc0aeca4b0eb1a50b7f94a4ebb41bd11522847a3c43bc4b48f4de1f16c" gracePeriod=10 Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.506990 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.540463 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:45 crc kubenswrapper[4826]: E0319 19:16:45.540853 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5fbfcc-053e-4453-961c-91a0719cdaa6" containerName="oc" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.540872 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5fbfcc-053e-4453-961c-91a0719cdaa6" containerName="oc" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.541078 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5fbfcc-053e-4453-961c-91a0719cdaa6" containerName="oc" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.542042 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.543542 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.558067 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.647636 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.647936 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.648148 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.648202 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpw8w\" (UniqueName: \"kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.665017 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7srxv"] Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.666358 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.670143 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.684443 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7srxv"] Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.750619 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751006 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751039 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpw8w\" (UniqueName: \"kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751063 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-combined-ca-bundle\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751083 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-config\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751107 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4f96\" (UniqueName: \"kubernetes.io/projected/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-kube-api-access-r4f96\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751148 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovn-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751169 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovs-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751189 
4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.751225 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.752111 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.752621 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.753469 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.773575 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xpw8w\" (UniqueName: \"kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w\") pod \"dnsmasq-dns-5bf47b49b7-gsfz5\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853294 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovs-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853366 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853484 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-combined-ca-bundle\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853502 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-config\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853528 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4f96\" (UniqueName: 
\"kubernetes.io/projected/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-kube-api-access-r4f96\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853562 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovn-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.853851 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovn-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.854545 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-config\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.854817 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-ovs-rundir\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.859355 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-combined-ca-bundle\") pod 
\"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.882184 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.892508 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4f96\" (UniqueName: \"kubernetes.io/projected/0c42846f-f2bf-4928-bc9b-d200b10cbf9b-kube-api-access-r4f96\") pod \"ovn-controller-metrics-7srxv\" (UID: \"0c42846f-f2bf-4928-bc9b-d200b10cbf9b\") " pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.935187 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.935581 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="dnsmasq-dns" containerID="cri-o://a5b9f30c58e83a0c0764abbc60312b001c058fdf98943cc86e22b46a562c7676" gracePeriod=10 Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.940268 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.974503 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:45 crc kubenswrapper[4826]: I0319 19:16:45.991257 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7srxv" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.008979 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-pvshc"] Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.010584 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.061452 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.061503 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.061564 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vb9m\" (UniqueName: \"kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.061694 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " 
pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.095801 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-pvshc"] Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.145992 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.167679 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.173898 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.187737 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.187892 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.188015 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2vb9m\" (UniqueName: \"kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.189317 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.194994 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.212038 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-pvshc"] Mar 19 19:16:46 crc kubenswrapper[4826]: E0319 19:16:46.213039 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2vb9m], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-57d65f699f-pvshc" podUID="235a902e-cd55-4515-aeda-c19cc09af6c3" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.226398 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vb9m\" (UniqueName: \"kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m\") pod \"dnsmasq-dns-57d65f699f-pvshc\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 
19:16:46.263727 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.265392 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.271747 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.278555 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.293614 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.293683 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.293742 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wq2\" (UniqueName: \"kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.293778 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.293832 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.359796 4826 generic.go:334] "Generic (PLEG): container finished" podID="a9d44695-2372-4bc3-a893-1d82703fc963" containerID="4d4147fc0aeca4b0eb1a50b7f94a4ebb41bd11522847a3c43bc4b48f4de1f16c" exitCode=0 Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.359874 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" event={"ID":"a9d44695-2372-4bc3-a893-1d82703fc963","Type":"ContainerDied","Data":"4d4147fc0aeca4b0eb1a50b7f94a4ebb41bd11522847a3c43bc4b48f4de1f16c"} Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.399747 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.399890 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 
crc kubenswrapper[4826]: I0319 19:16:46.399926 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.399979 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wq2\" (UniqueName: \"kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.400002 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.401052 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.401536 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.402040 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.402488 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.415710 4826 generic.go:334] "Generic (PLEG): container finished" podID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerID="a5b9f30c58e83a0c0764abbc60312b001c058fdf98943cc86e22b46a562c7676" exitCode=0 Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.415809 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.416520 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" event={"ID":"2bb5b40f-d23c-411f-82d2-7d443b88a00b","Type":"ContainerDied","Data":"a5b9f30c58e83a0c0764abbc60312b001c058fdf98943cc86e22b46a562c7676"} Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.417961 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.433693 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wq2\" (UniqueName: \"kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2\") pod \"dnsmasq-dns-b8fbc5445-nqjw7\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") " pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.449741 4826 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.486834 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.547308 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.604665 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config\") pod \"235a902e-cd55-4515-aeda-c19cc09af6c3\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.604824 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vb9m\" (UniqueName: \"kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m\") pod \"235a902e-cd55-4515-aeda-c19cc09af6c3\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.604962 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc\") pod \"235a902e-cd55-4515-aeda-c19cc09af6c3\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.605088 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb\") pod \"235a902e-cd55-4515-aeda-c19cc09af6c3\" (UID: \"235a902e-cd55-4515-aeda-c19cc09af6c3\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.605467 4826 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "235a902e-cd55-4515-aeda-c19cc09af6c3" (UID: "235a902e-cd55-4515-aeda-c19cc09af6c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.605623 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "235a902e-cd55-4515-aeda-c19cc09af6c3" (UID: "235a902e-cd55-4515-aeda-c19cc09af6c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.606329 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.606357 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.606505 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config" (OuterVolumeSpecName: "config") pod "235a902e-cd55-4515-aeda-c19cc09af6c3" (UID: "235a902e-cd55-4515-aeda-c19cc09af6c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.615580 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m" (OuterVolumeSpecName: "kube-api-access-2vb9m") pod "235a902e-cd55-4515-aeda-c19cc09af6c3" (UID: "235a902e-cd55-4515-aeda-c19cc09af6c3"). InnerVolumeSpecName "kube-api-access-2vb9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.646311 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.707419 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config\") pod \"a9d44695-2372-4bc3-a893-1d82703fc963\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.707470 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szr2d\" (UniqueName: \"kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d\") pod \"a9d44695-2372-4bc3-a893-1d82703fc963\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.707553 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc\") pod \"a9d44695-2372-4bc3-a893-1d82703fc963\" (UID: \"a9d44695-2372-4bc3-a893-1d82703fc963\") " Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.708698 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/235a902e-cd55-4515-aeda-c19cc09af6c3-config\") on node \"crc\" DevicePath \"\"" 
Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.708719 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vb9m\" (UniqueName: \"kubernetes.io/projected/235a902e-cd55-4515-aeda-c19cc09af6c3-kube-api-access-2vb9m\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.710752 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d" (OuterVolumeSpecName: "kube-api-access-szr2d") pod "a9d44695-2372-4bc3-a893-1d82703fc963" (UID: "a9d44695-2372-4bc3-a893-1d82703fc963"). InnerVolumeSpecName "kube-api-access-szr2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.767461 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config" (OuterVolumeSpecName: "config") pod "a9d44695-2372-4bc3-a893-1d82703fc963" (UID: "a9d44695-2372-4bc3-a893-1d82703fc963"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.799846 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9d44695-2372-4bc3-a893-1d82703fc963" (UID: "a9d44695-2372-4bc3-a893-1d82703fc963"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.810801 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szr2d\" (UniqueName: \"kubernetes.io/projected/a9d44695-2372-4bc3-a893-1d82703fc963-kube-api-access-szr2d\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.810831 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.810842 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d44695-2372-4bc3-a893-1d82703fc963-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.928318 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:16:46 crc kubenswrapper[4826]: E0319 19:16:46.929611 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="dnsmasq-dns" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.929623 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="dnsmasq-dns" Mar 19 19:16:46 crc kubenswrapper[4826]: E0319 19:16:46.929692 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="init" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.929808 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="init" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.939317 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" containerName="dnsmasq-dns" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 
19:16:46.947745 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.947949 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.949391 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.950342 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.950681 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.950836 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jv9nr" Mar 19 19:16:46 crc kubenswrapper[4826]: I0319 19:16:46.995173 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.023488 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7srxv"] Mar 19 19:16:47 crc kubenswrapper[4826]: W0319 19:16:47.054869 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bfe186_0db4_4b55_a738_bcfa3cf10972.slice/crio-a38d4029a9e1d19da603d18d84b0c041d126bb3e9050b0a32b3f3c0c42e082ef WatchSource:0}: Error finding container a38d4029a9e1d19da603d18d84b0c041d126bb3e9050b0a32b3f3c0c42e082ef: Status 404 returned error can't find the container with id a38d4029a9e1d19da603d18d84b0c041d126bb3e9050b0a32b3f3c0c42e082ef Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.055467 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xnqgk\" (UniqueName: \"kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk\") pod \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.055555 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc\") pod \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.055721 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config\") pod \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\" (UID: \"2bb5b40f-d23c-411f-82d2-7d443b88a00b\") " Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.055988 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-scripts\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056029 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-config\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056094 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjwt\" (UniqueName: \"kubernetes.io/projected/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-kube-api-access-8hjwt\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " 
pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056172 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056201 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056263 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.056307 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.062050 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk" (OuterVolumeSpecName: "kube-api-access-xnqgk") pod "2bb5b40f-d23c-411f-82d2-7d443b88a00b" (UID: "2bb5b40f-d23c-411f-82d2-7d443b88a00b"). InnerVolumeSpecName "kube-api-access-xnqgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.078619 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.105724 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bb5b40f-d23c-411f-82d2-7d443b88a00b" (UID: "2bb5b40f-d23c-411f-82d2-7d443b88a00b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.107447 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config" (OuterVolumeSpecName: "config") pod "2bb5b40f-d23c-411f-82d2-7d443b88a00b" (UID: "2bb5b40f-d23c-411f-82d2-7d443b88a00b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.157742 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.157814 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.157841 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-scripts\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.157871 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-config\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.157924 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjwt\" (UniqueName: \"kubernetes.io/projected/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-kube-api-access-8hjwt\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158315 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158353 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158448 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158472 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnqgk\" (UniqueName: \"kubernetes.io/projected/2bb5b40f-d23c-411f-82d2-7d443b88a00b-kube-api-access-xnqgk\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158488 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bb5b40f-d23c-411f-82d2-7d443b88a00b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.158940 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-scripts\") pod \"ovn-northd-0\" (UID: 
\"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.159146 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-config\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.160277 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.160696 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="init" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.160714 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="init" Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.160731 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="dnsmasq-dns" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.160738 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="dnsmasq-dns" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.160938 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" containerName="dnsmasq-dns" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.164606 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.168293 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.171717 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.173995 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.174153 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.174261 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-c6qpp" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.174472 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.175628 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.199693 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjwt\" (UniqueName: \"kubernetes.io/projected/d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e-kube-api-access-8hjwt\") pod \"ovn-northd-0\" (UID: \"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e\") " pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.203306 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:16:47 crc kubenswrapper[4826]: 
I0319 19:16:47.261840 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-lock\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.261883 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.261917 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-cache\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.261953 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.262076 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48ml\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-kube-api-access-s48ml\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.262098 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f9d8a-377a-4913-b2d2-3bb1b7aec077-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.292845 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364081 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-lock\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364126 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364155 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-cache\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364191 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364309 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48ml\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-kube-api-access-s48ml\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.364334 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f9d8a-377a-4913-b2d2-3bb1b7aec077-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.365430 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.365458 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.365500 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:16:47.86548204 +0000 UTC m=+1232.619550343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.365536 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-lock\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.366223 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/775f9d8a-377a-4913-b2d2-3bb1b7aec077-cache\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.367276 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.367301 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/920270afe4cd40fe1185d3584dcaa93881765c04116f890c98a06296bd0464ad/globalmount\"" pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.369601 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f9d8a-377a-4913-b2d2-3bb1b7aec077-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.392773 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48ml\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-kube-api-access-s48ml\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.428446 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7srxv" event={"ID":"0c42846f-f2bf-4928-bc9b-d200b10cbf9b","Type":"ContainerStarted","Data":"5fd6afbcb2bc6715a7aa5bd6e164e330ce915da2b162f1751c2be450c9329133"} Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.437079 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" event={"ID":"2bb5b40f-d23c-411f-82d2-7d443b88a00b","Type":"ContainerDied","Data":"c862cca108d182ddae56c93e9ddfb98b3996c6bff89adb16e3511c1f0ab34b9f"} Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.437132 
4826 scope.go:117] "RemoveContainer" containerID="a5b9f30c58e83a0c0764abbc60312b001c058fdf98943cc86e22b46a562c7676" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.437483 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qtzpm" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.442741 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9cb3aa5c-c139-41c7-8780-d759205c3c5d\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.454781 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" event={"ID":"a9d44695-2372-4bc3-a893-1d82703fc963","Type":"ContainerDied","Data":"25adced0ca0e574256d7e7846669aa2040f594750619b17e332ec32be7862ddc"} Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.454881 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-css7p" Mar 19 19:16:47 crc kubenswrapper[4826]: W0319 19:16:47.478539 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22cc2a8_4d4f_42c8_b22e_478636b5259e.slice/crio-dd37605856f33ca28b622c507288eac0f0b11f5073b144c9890a2d1128a75804 WatchSource:0}: Error finding container dd37605856f33ca28b622c507288eac0f0b11f5073b144c9890a2d1128a75804: Status 404 returned error can't find the container with id dd37605856f33ca28b622c507288eac0f0b11f5073b144c9890a2d1128a75804 Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.478819 4826 scope.go:117] "RemoveContainer" containerID="356c2ada1626789f06f3e9dfe2d3d9cadf24fc37bd6cae4a220c9542763a18bf" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.481111 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.510918 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" event={"ID":"34bfe186-0db4-4b55-a738-bcfa3cf10972","Type":"ContainerStarted","Data":"a38d4029a9e1d19da603d18d84b0c041d126bb3e9050b0a32b3f3c0c42e082ef"} Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.511797 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-pvshc" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.515615 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.531003 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-css7p"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.540915 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.559063 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qtzpm"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.567062 4826 scope.go:117] "RemoveContainer" containerID="4d4147fc0aeca4b0eb1a50b7f94a4ebb41bd11522847a3c43bc4b48f4de1f16c" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.577325 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-pvshc"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.609303 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-pvshc"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.622354 4826 scope.go:117] "RemoveContainer" containerID="379adec6e5c7baf26d53552d8e77b0964e8f8693cfa0e9b5245dd4e7c4a91013" Mar 19 19:16:47 crc kubenswrapper[4826]: W0319 19:16:47.864319 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c4b87f_ecc2_49f3_98db_f9e81cacdb7e.slice/crio-f572ce602f42615b59f64ce1b375e6167ea5293b5369e92d0d7522dc5d9a1be5 WatchSource:0}: Error finding container f572ce602f42615b59f64ce1b375e6167ea5293b5369e92d0d7522dc5d9a1be5: Status 404 returned error can't find the container with id f572ce602f42615b59f64ce1b375e6167ea5293b5369e92d0d7522dc5d9a1be5 Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 
19:16:47.875877 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.901613 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.901859 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.901875 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: E0319 19:16:47.901923 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:16:48.901907165 +0000 UTC m=+1233.655975488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.988708 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235a902e-cd55-4515-aeda-c19cc09af6c3" path="/var/lib/kubelet/pods/235a902e-cd55-4515-aeda-c19cc09af6c3/volumes" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.989390 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb5b40f-d23c-411f-82d2-7d443b88a00b" path="/var/lib/kubelet/pods/2bb5b40f-d23c-411f-82d2-7d443b88a00b/volumes" Mar 19 19:16:47 crc kubenswrapper[4826]: I0319 19:16:47.990246 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d44695-2372-4bc3-a893-1d82703fc963" path="/var/lib/kubelet/pods/a9d44695-2372-4bc3-a893-1d82703fc963/volumes" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.344312 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s8pzl"] Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.346151 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.347573 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.348012 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.355998 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s8pzl"] Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.356427 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.514950 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515037 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515093 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515133 
4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515155 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515168 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.515198 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hldr6\" (UniqueName: \"kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.525754 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e","Type":"ContainerStarted","Data":"f572ce602f42615b59f64ce1b375e6167ea5293b5369e92d0d7522dc5d9a1be5"} Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.526835 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" event={"ID":"f22cc2a8-4d4f-42c8-b22e-478636b5259e","Type":"ContainerStarted","Data":"dd37605856f33ca28b622c507288eac0f0b11f5073b144c9890a2d1128a75804"} Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617228 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617318 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617437 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617465 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift\") pod \"swift-ring-rebalance-s8pzl\" (UID: 
\"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617484 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.617525 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hldr6\" (UniqueName: \"kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.618471 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.618627 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.619291 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 
19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.624879 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.625319 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.626242 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.648573 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hldr6\" (UniqueName: \"kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6\") pod \"swift-ring-rebalance-s8pzl\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.666381 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:16:48 crc kubenswrapper[4826]: I0319 19:16:48.927408 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:48 crc kubenswrapper[4826]: E0319 19:16:48.927667 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:16:48 crc kubenswrapper[4826]: E0319 19:16:48.927928 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:16:48 crc kubenswrapper[4826]: E0319 19:16:48.928029 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:16:50.928005415 +0000 UTC m=+1235.682073738 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:16:49 crc kubenswrapper[4826]: I0319 19:16:49.209937 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s8pzl"] Mar 19 19:16:49 crc kubenswrapper[4826]: W0319 19:16:49.221835 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9327e4ee_66b1_4f08_9cb8_9facc43491b4.slice/crio-d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7 WatchSource:0}: Error finding container d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7: Status 404 returned error can't find the container with id d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7 Mar 19 19:16:49 crc kubenswrapper[4826]: I0319 19:16:49.540357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s8pzl" event={"ID":"9327e4ee-66b1-4f08-9cb8-9facc43491b4","Type":"ContainerStarted","Data":"d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7"} Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.554022 4826 generic.go:334] "Generic (PLEG): container finished" podID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerID="7dd8632a1dbc9b1613971d71d747506f77ad4235ae2414e8a2b662fa7bb01f32" exitCode=0 Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.554181 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" event={"ID":"34bfe186-0db4-4b55-a738-bcfa3cf10972","Type":"ContainerDied","Data":"7dd8632a1dbc9b1613971d71d747506f77ad4235ae2414e8a2b662fa7bb01f32"} Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.557850 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7srxv" 
event={"ID":"0c42846f-f2bf-4928-bc9b-d200b10cbf9b","Type":"ContainerStarted","Data":"3967d4a9ff8305ee2b0fc81290c391201e536570aabe77bc3466d4fe950d9e54"} Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.561009 4826 generic.go:334] "Generic (PLEG): container finished" podID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerID="15014fa0408a8c08289b2115c220d8fc06ec8c78be7b7f7d342af741e3a520f0" exitCode=0 Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.561063 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" event={"ID":"f22cc2a8-4d4f-42c8-b22e-478636b5259e","Type":"ContainerDied","Data":"15014fa0408a8c08289b2115c220d8fc06ec8c78be7b7f7d342af741e3a520f0"} Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.610960 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7srxv" podStartSLOduration=5.61093887 podStartE2EDuration="5.61093887s" podCreationTimestamp="2026-03-19 19:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:50.592490244 +0000 UTC m=+1235.346558557" watchObservedRunningTime="2026-03-19 19:16:50.61093887 +0000 UTC m=+1235.365007183" Mar 19 19:16:50 crc kubenswrapper[4826]: I0319 19:16:50.995528 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:50 crc kubenswrapper[4826]: E0319 19:16:50.995743 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:16:50 crc kubenswrapper[4826]: E0319 19:16:50.997254 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap 
"swift-ring-files" not found Mar 19 19:16:50 crc kubenswrapper[4826]: E0319 19:16:50.997342 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:16:54.997308988 +0000 UTC m=+1239.751377311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.585219 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" event={"ID":"34bfe186-0db4-4b55-a738-bcfa3cf10972","Type":"ContainerStarted","Data":"55c9d0abe7da1d3babdd4a2029746eaf635cbed25ef96c900a5e115058e65647"} Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.585590 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.587642 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e","Type":"ContainerStarted","Data":"4ad5ed24fe6486996447b834ca03548f17b5918afa8d291e0009552d1ae7057e"} Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.593811 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" event={"ID":"f22cc2a8-4d4f-42c8-b22e-478636b5259e","Type":"ContainerStarted","Data":"326c7e917beb6524f0a0ba588c56756018b6f25727555edcfa14a915da74514e"} Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.593934 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.601609 
4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.601672 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.610956 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" podStartSLOduration=6.610930299 podStartE2EDuration="6.610930299s" podCreationTimestamp="2026-03-19 19:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:51.601953471 +0000 UTC m=+1236.356021804" watchObservedRunningTime="2026-03-19 19:16:51.610930299 +0000 UTC m=+1236.364998612" Mar 19 19:16:51 crc kubenswrapper[4826]: I0319 19:16:51.627107 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" podStartSLOduration=5.627061708 podStartE2EDuration="5.627061708s" podCreationTimestamp="2026-03-19 19:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:16:51.6208988 +0000 UTC m=+1236.374967113" watchObservedRunningTime="2026-03-19 19:16:51.627061708 +0000 UTC m=+1236.381130031" Mar 19 19:16:52 crc kubenswrapper[4826]: I0319 19:16:52.628893 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:52 crc kubenswrapper[4826]: I0319 19:16:52.629789 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:52 crc kubenswrapper[4826]: I0319 19:16:52.716064 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:53 crc kubenswrapper[4826]: I0319 
19:16:53.740299 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 19:16:53 crc kubenswrapper[4826]: I0319 19:16:53.947287 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.063409 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.330353 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9d24-account-create-update-qr7td"] Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.331970 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.333933 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.340263 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9d24-account-create-update-qr7td"] Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.507988 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts\") pod \"placement-9d24-account-create-update-qr7td\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.508262 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48hr\" (UniqueName: \"kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr\") pod \"placement-9d24-account-create-update-qr7td\" (UID: 
\"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.610475 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts\") pod \"placement-9d24-account-create-update-qr7td\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.610649 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48hr\" (UniqueName: \"kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr\") pod \"placement-9d24-account-create-update-qr7td\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.611279 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts\") pod \"placement-9d24-account-create-update-qr7td\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.639490 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48hr\" (UniqueName: \"kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr\") pod \"placement-9d24-account-create-update-qr7td\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:54 crc kubenswrapper[4826]: I0319 19:16:54.665095 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:16:55 crc kubenswrapper[4826]: I0319 19:16:55.020926 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:16:55 crc kubenswrapper[4826]: E0319 19:16:55.021177 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:16:55 crc kubenswrapper[4826]: E0319 19:16:55.021431 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:16:55 crc kubenswrapper[4826]: E0319 19:16:55.021500 4826 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:17:03.02147993 +0000 UTC m=+1247.775548253 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:16:55 crc kubenswrapper[4826]: I0319 19:16:55.930259 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-v8dg5"] Mar 19 19:16:55 crc kubenswrapper[4826]: I0319 19:16:55.931639 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:55 crc kubenswrapper[4826]: I0319 19:16:55.947392 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-v8dg5"] Mar 19 19:16:55 crc kubenswrapper[4826]: I0319 19:16:55.994705 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.042783 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tbt\" (UniqueName: \"kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.042905 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.140808 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-3908-account-create-update-qx49d"] Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.142429 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.144318 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tbt\" (UniqueName: \"kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.144468 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.144549 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.145824 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.165377 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tbt\" (UniqueName: \"kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt\") pod \"mysqld-exporter-openstack-db-create-v8dg5\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 
19:16:56.166569 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3908-account-create-update-qx49d"] Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.246720 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.246780 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnh4\" (UniqueName: \"kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.262852 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.349518 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.349576 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnh4\" (UniqueName: \"kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.350266 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.365795 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnh4\" (UniqueName: \"kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4\") pod \"mysqld-exporter-3908-account-create-update-qx49d\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.464356 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.647854 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.716944 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:56 crc kubenswrapper[4826]: I0319 19:16:56.717170 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="dnsmasq-dns" containerID="cri-o://55c9d0abe7da1d3babdd4a2029746eaf635cbed25ef96c900a5e115058e65647" gracePeriod=10 Mar 19 19:16:57 crc kubenswrapper[4826]: I0319 19:16:57.659209 4826 generic.go:334] "Generic (PLEG): container finished" podID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerID="55c9d0abe7da1d3babdd4a2029746eaf635cbed25ef96c900a5e115058e65647" exitCode=0 Mar 19 19:16:57 crc kubenswrapper[4826]: I0319 19:16:57.659294 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" event={"ID":"34bfe186-0db4-4b55-a738-bcfa3cf10972","Type":"ContainerDied","Data":"55c9d0abe7da1d3babdd4a2029746eaf635cbed25ef96c900a5e115058e65647"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.254549 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.429110 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc\") pod \"34bfe186-0db4-4b55-a738-bcfa3cf10972\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.429465 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb\") pod \"34bfe186-0db4-4b55-a738-bcfa3cf10972\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.429682 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config\") pod \"34bfe186-0db4-4b55-a738-bcfa3cf10972\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.429711 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpw8w\" (UniqueName: \"kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w\") pod \"34bfe186-0db4-4b55-a738-bcfa3cf10972\" (UID: \"34bfe186-0db4-4b55-a738-bcfa3cf10972\") " Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.436825 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w" (OuterVolumeSpecName: "kube-api-access-xpw8w") pod "34bfe186-0db4-4b55-a738-bcfa3cf10972" (UID: "34bfe186-0db4-4b55-a738-bcfa3cf10972"). InnerVolumeSpecName "kube-api-access-xpw8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.492076 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34bfe186-0db4-4b55-a738-bcfa3cf10972" (UID: "34bfe186-0db4-4b55-a738-bcfa3cf10972"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.493202 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config" (OuterVolumeSpecName: "config") pod "34bfe186-0db4-4b55-a738-bcfa3cf10972" (UID: "34bfe186-0db4-4b55-a738-bcfa3cf10972"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.494362 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34bfe186-0db4-4b55-a738-bcfa3cf10972" (UID: "34bfe186-0db4-4b55-a738-bcfa3cf10972"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.533617 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.533685 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpw8w\" (UniqueName: \"kubernetes.io/projected/34bfe186-0db4-4b55-a738-bcfa3cf10972-kube-api-access-xpw8w\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.533706 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.533722 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34bfe186-0db4-4b55-a738-bcfa3cf10972-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.600125 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9d24-account-create-update-qr7td"] Mar 19 19:16:59 crc kubenswrapper[4826]: W0319 19:16:59.637097 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod330a9484_e555_4aa4_ae9a_7d09bf97571a.slice/crio-9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4 WatchSource:0}: Error finding container 9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4: Status 404 returned error can't find the container with id 9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4 Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.687757 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" 
event={"ID":"34bfe186-0db4-4b55-a738-bcfa3cf10972","Type":"ContainerDied","Data":"a38d4029a9e1d19da603d18d84b0c041d126bb3e9050b0a32b3f3c0c42e082ef"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.687810 4826 scope.go:117] "RemoveContainer" containerID="55c9d0abe7da1d3babdd4a2029746eaf635cbed25ef96c900a5e115058e65647" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.687951 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-gsfz5" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.699920 4826 generic.go:334] "Generic (PLEG): container finished" podID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerID="045565dbdc1fcc69a6554c263960690e114832329c56dc8084a5c107a59ae84b" exitCode=0 Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.699984 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerDied","Data":"045565dbdc1fcc69a6554c263960690e114832329c56dc8084a5c107a59ae84b"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.707154 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerStarted","Data":"c0f1f7d466460401af90a3d9bff2ecdb385e1668e8526c5c229b079613d3dbec"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.708894 4826 generic.go:334] "Generic (PLEG): container finished" podID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerID="4949977f15e132445d8d9e1657957d62bc88426ae01376ce6c0dd6719b94f8e3" exitCode=0 Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.708952 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerDied","Data":"4949977f15e132445d8d9e1657957d62bc88426ae01376ce6c0dd6719b94f8e3"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 
19:16:59.711944 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-v8dg5"] Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.712946 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e","Type":"ContainerStarted","Data":"711a02086e23bb94086eb4585b9380441b8cc92e3e070685f014593c34139976"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.713220 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 19:16:59 crc kubenswrapper[4826]: W0319 19:16:59.713921 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8948d5dc_068b_4571_bd13_f0cbf3750004.slice/crio-afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435 WatchSource:0}: Error finding container afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435: Status 404 returned error can't find the container with id afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435 Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.714648 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d24-account-create-update-qr7td" event={"ID":"330a9484-e555-4aa4-ae9a-7d09bf97571a","Type":"ContainerStarted","Data":"9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4"} Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.716323 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s8pzl" event={"ID":"9327e4ee-66b1-4f08-9cb8-9facc43491b4","Type":"ContainerStarted","Data":"10a151b600161f6457bd2b2786cc4e512f55741f9173282433b805759f2cb53a"} Mar 19 19:16:59 crc kubenswrapper[4826]: W0319 19:16:59.716367 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c70b2d_b15f_4fbe_a820_d24479ea0d40.slice/crio-a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295 WatchSource:0}: Error finding container a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295: Status 404 returned error can't find the container with id a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295 Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.729798 4826 scope.go:117] "RemoveContainer" containerID="7dd8632a1dbc9b1613971d71d747506f77ad4235ae2414e8a2b662fa7bb01f32" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.743839 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-3908-account-create-update-qx49d"] Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.779719 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.796563 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-gsfz5"] Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.798024 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=10.395194292 podStartE2EDuration="13.798001725s" podCreationTimestamp="2026-03-19 19:16:46 +0000 UTC" firstStartedPulling="2026-03-19 19:16:47.871280964 +0000 UTC m=+1232.625349297" lastFinishedPulling="2026-03-19 19:16:51.274088417 +0000 UTC m=+1236.028156730" observedRunningTime="2026-03-19 19:16:59.77257628 +0000 UTC m=+1244.526644603" watchObservedRunningTime="2026-03-19 19:16:59.798001725 +0000 UTC m=+1244.552070048" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.839377 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s8pzl" podStartSLOduration=1.962687883 podStartE2EDuration="11.839358935s" podCreationTimestamp="2026-03-19 
19:16:48 +0000 UTC" firstStartedPulling="2026-03-19 19:16:49.223380064 +0000 UTC m=+1233.977448377" lastFinishedPulling="2026-03-19 19:16:59.100051116 +0000 UTC m=+1243.854119429" observedRunningTime="2026-03-19 19:16:59.824888414 +0000 UTC m=+1244.578956737" watchObservedRunningTime="2026-03-19 19:16:59.839358935 +0000 UTC m=+1244.593427248" Mar 19 19:16:59 crc kubenswrapper[4826]: E0319 19:16:59.869385 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bfe186_0db4_4b55_a738_bcfa3cf10972.slice\": RecentStats: unable to find data in memory cache]" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.899811 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vgjs9"] Mar 19 19:16:59 crc kubenswrapper[4826]: E0319 19:16:59.900374 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="dnsmasq-dns" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.900397 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="dnsmasq-dns" Mar 19 19:16:59 crc kubenswrapper[4826]: E0319 19:16:59.900416 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="init" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.900425 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="init" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.900691 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" containerName="dnsmasq-dns" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.902079 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vgjs9" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.904469 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.910677 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vgjs9"] Mar 19 19:16:59 crc kubenswrapper[4826]: I0319 19:16:59.988202 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bfe186-0db4-4b55-a738-bcfa3cf10972" path="/var/lib/kubelet/pods/34bfe186-0db4-4b55-a738-bcfa3cf10972/volumes" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.053578 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.053784 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57qm\" (UniqueName: \"kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.155694 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.155816 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57qm\" (UniqueName: \"kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.157172 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.182677 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57qm\" (UniqueName: \"kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm\") pod \"root-account-create-update-vgjs9\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.327395 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.727520 4826 generic.go:334] "Generic (PLEG): container finished" podID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerID="9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd" exitCode=0 Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.727614 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerDied","Data":"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.730623 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" event={"ID":"8948d5dc-068b-4571-bd13-f0cbf3750004","Type":"ContainerStarted","Data":"9aa31870c72ebdb95890aa6803e6340e13d0ef42dc8a9d3b05b88ee93d4924f8"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.730676 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" event={"ID":"8948d5dc-068b-4571-bd13-f0cbf3750004","Type":"ContainerStarted","Data":"afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.736603 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerStarted","Data":"44ca87a6f1315aca3fa78783b92f5e8f7f8b8fcb3cb71b35e857cd4f7578c88c"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.736970 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.738567 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" 
event={"ID":"a6c70b2d-b15f-4fbe-a820-d24479ea0d40","Type":"ContainerStarted","Data":"25353c4ed41f810e11e5308d624954397db7245fdba59471262236628c14c0a9"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.738595 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" event={"ID":"a6c70b2d-b15f-4fbe-a820-d24479ea0d40","Type":"ContainerStarted","Data":"a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.741038 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerStarted","Data":"e599ec2e8b0d98e5639b385d8980140372266b12b05d4bf6c5c497a47fd71073"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.741696 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.743167 4826 generic.go:334] "Generic (PLEG): container finished" podID="330a9484-e555-4aa4-ae9a-7d09bf97571a" containerID="27c7991ce5d2cc96d8ed0da4727b8a95d278d96e5d67b995acf8aa3e271bdcf2" exitCode=0 Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.744087 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d24-account-create-update-qr7td" event={"ID":"330a9484-e555-4aa4-ae9a-7d09bf97571a","Type":"ContainerDied","Data":"27c7991ce5d2cc96d8ed0da4727b8a95d278d96e5d67b995acf8aa3e271bdcf2"} Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.784963 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" podStartSLOduration=5.784938888 podStartE2EDuration="5.784938888s" podCreationTimestamp="2026-03-19 19:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
19:17:00.770044268 +0000 UTC m=+1245.524112581" watchObservedRunningTime="2026-03-19 19:17:00.784938888 +0000 UTC m=+1245.539007211" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.813165 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.659988065 podStartE2EDuration="1m2.81314938s" podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:16:01.758620215 +0000 UTC m=+1186.512688518" lastFinishedPulling="2026-03-19 19:16:25.91178152 +0000 UTC m=+1210.665849833" observedRunningTime="2026-03-19 19:17:00.804491891 +0000 UTC m=+1245.558560204" watchObservedRunningTime="2026-03-19 19:17:00.81314938 +0000 UTC m=+1245.567217693" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.846744 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=38.779676678 podStartE2EDuration="1m2.846730812s" podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:16:01.602900351 +0000 UTC m=+1186.356968664" lastFinishedPulling="2026-03-19 19:16:25.669954485 +0000 UTC m=+1210.424022798" observedRunningTime="2026-03-19 19:17:00.845328308 +0000 UTC m=+1245.599396621" watchObservedRunningTime="2026-03-19 19:17:00.846730812 +0000 UTC m=+1245.600799125" Mar 19 19:17:00 crc kubenswrapper[4826]: I0319 19:17:00.866107 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vgjs9"] Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.753414 4826 generic.go:334] "Generic (PLEG): container finished" podID="a6c70b2d-b15f-4fbe-a820-d24479ea0d40" containerID="25353c4ed41f810e11e5308d624954397db7245fdba59471262236628c14c0a9" exitCode=0 Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.753517 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" 
event={"ID":"a6c70b2d-b15f-4fbe-a820-d24479ea0d40","Type":"ContainerDied","Data":"25353c4ed41f810e11e5308d624954397db7245fdba59471262236628c14c0a9"} Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.755751 4826 generic.go:334] "Generic (PLEG): container finished" podID="db7f0121-bd0f-4144-9b3b-98d0b4d2d682" containerID="04d31648074977e485b156fca3ec61b392463a5f1231866828c07c4f46cf8e09" exitCode=0 Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.755787 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vgjs9" event={"ID":"db7f0121-bd0f-4144-9b3b-98d0b4d2d682","Type":"ContainerDied","Data":"04d31648074977e485b156fca3ec61b392463a5f1231866828c07c4f46cf8e09"} Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.755832 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vgjs9" event={"ID":"db7f0121-bd0f-4144-9b3b-98d0b4d2d682","Type":"ContainerStarted","Data":"8aa1a0d86fa61584b7de591ed955cdcf1c139ac822b2c37e438ee27a766d670b"} Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.770257 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerStarted","Data":"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3"} Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.770491 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.774327 4826 generic.go:334] "Generic (PLEG): container finished" podID="8948d5dc-068b-4571-bd13-f0cbf3750004" containerID="9aa31870c72ebdb95890aa6803e6340e13d0ef42dc8a9d3b05b88ee93d4924f8" exitCode=0 Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.774545 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" 
event={"ID":"8948d5dc-068b-4571-bd13-f0cbf3750004","Type":"ContainerDied","Data":"9aa31870c72ebdb95890aa6803e6340e13d0ef42dc8a9d3b05b88ee93d4924f8"} Mar 19 19:17:01 crc kubenswrapper[4826]: I0319 19:17:01.856725 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.688870873 podStartE2EDuration="1m3.856703073s" podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:16:01.705043699 +0000 UTC m=+1186.459112012" lastFinishedPulling="2026-03-19 19:16:25.872875899 +0000 UTC m=+1210.626944212" observedRunningTime="2026-03-19 19:17:01.842708304 +0000 UTC m=+1246.596776617" watchObservedRunningTime="2026-03-19 19:17:01.856703073 +0000 UTC m=+1246.610771386" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.484271 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.492110 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.605275 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48hr\" (UniqueName: \"kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr\") pod \"330a9484-e555-4aa4-ae9a-7d09bf97571a\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.605461 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts\") pod \"330a9484-e555-4aa4-ae9a-7d09bf97571a\" (UID: \"330a9484-e555-4aa4-ae9a-7d09bf97571a\") " Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.605491 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts\") pod \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.605513 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbnh4\" (UniqueName: \"kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4\") pod \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\" (UID: \"a6c70b2d-b15f-4fbe-a820-d24479ea0d40\") " Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.605954 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "330a9484-e555-4aa4-ae9a-7d09bf97571a" (UID: "330a9484-e555-4aa4-ae9a-7d09bf97571a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.606098 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330a9484-e555-4aa4-ae9a-7d09bf97571a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.612455 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6c70b2d-b15f-4fbe-a820-d24479ea0d40" (UID: "a6c70b2d-b15f-4fbe-a820-d24479ea0d40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.614004 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr" (OuterVolumeSpecName: "kube-api-access-d48hr") pod "330a9484-e555-4aa4-ae9a-7d09bf97571a" (UID: "330a9484-e555-4aa4-ae9a-7d09bf97571a"). InnerVolumeSpecName "kube-api-access-d48hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.628898 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4" (OuterVolumeSpecName: "kube-api-access-xbnh4") pod "a6c70b2d-b15f-4fbe-a820-d24479ea0d40" (UID: "a6c70b2d-b15f-4fbe-a820-d24479ea0d40"). InnerVolumeSpecName "kube-api-access-xbnh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.708223 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48hr\" (UniqueName: \"kubernetes.io/projected/330a9484-e555-4aa4-ae9a-7d09bf97571a-kube-api-access-d48hr\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.708259 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.708273 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbnh4\" (UniqueName: \"kubernetes.io/projected/a6c70b2d-b15f-4fbe-a820-d24479ea0d40-kube-api-access-xbnh4\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.784078 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" event={"ID":"a6c70b2d-b15f-4fbe-a820-d24479ea0d40","Type":"ContainerDied","Data":"a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295"} Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.784120 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a162662e1666151177f448b1feb2247473ea42c585154e443551a3ff6f8f1295" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.784184 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-3908-account-create-update-qx49d" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.792017 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerStarted","Data":"8c9c5267be455ec2905d4a2bf8db5ab89c8e3ed5aac699b450dfeebccdf5f050"} Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.795234 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9d24-account-create-update-qr7td" Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.796878 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d24-account-create-update-qr7td" event={"ID":"330a9484-e555-4aa4-ae9a-7d09bf97571a","Type":"ContainerDied","Data":"9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4"} Mar 19 19:17:02 crc kubenswrapper[4826]: I0319 19:17:02.796942 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3e70ebb4890670504443281fd092bf651274732e39aeef2d838a0bd2124ad4" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.028522 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.033610 4826 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.034034 4826 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.034086 4826 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift podName:775f9d8a-377a-4913-b2d2-3bb1b7aec077 nodeName:}" failed. No retries permitted until 2026-03-19 19:17:19.034068518 +0000 UTC m=+1263.788136831 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift") pod "swift-storage-0" (UID: "775f9d8a-377a-4913-b2d2-3bb1b7aec077") : configmap "swift-ring-files" not found Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.219775 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-tg74r"] Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.220155 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c70b2d-b15f-4fbe-a820-d24479ea0d40" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.220166 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c70b2d-b15f-4fbe-a820-d24479ea0d40" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.220179 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330a9484-e555-4aa4-ae9a-7d09bf97571a" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.220185 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="330a9484-e555-4aa4-ae9a-7d09bf97571a" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.220410 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="330a9484-e555-4aa4-ae9a-7d09bf97571a" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.220431 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c70b2d-b15f-4fbe-a820-d24479ea0d40" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc 
kubenswrapper[4826]: I0319 19:17:03.221104 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.251937 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tg74r"] Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.256544 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.315536 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8770-account-create-update-p98hn"] Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.316047 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7f0121-bd0f-4144-9b3b-98d0b4d2d682" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.316062 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7f0121-bd0f-4144-9b3b-98d0b4d2d682" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.316786 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7f0121-bd0f-4144-9b3b-98d0b4d2d682" containerName="mariadb-account-create-update" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.317478 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.322921 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.323281 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.341334 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts\") pod \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.341411 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57qm\" (UniqueName: \"kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm\") pod \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\" (UID: \"db7f0121-bd0f-4144-9b3b-98d0b4d2d682\") " Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.341822 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5cl\" (UniqueName: \"kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.341854 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.343747 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db7f0121-bd0f-4144-9b3b-98d0b4d2d682" (UID: "db7f0121-bd0f-4144-9b3b-98d0b4d2d682"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.343799 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8770-account-create-update-p98hn"] Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.347931 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm" (OuterVolumeSpecName: "kube-api-access-x57qm") pod "db7f0121-bd0f-4144-9b3b-98d0b4d2d682" (UID: "db7f0121-bd0f-4144-9b3b-98d0b4d2d682"). InnerVolumeSpecName "kube-api-access-x57qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.366987 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-54c8fbc9df-9kq4s" podUID="8c238d2b-d512-419b-92dc-6d68bcba8655" containerName="console" containerID="cri-o://ef28f7712a323c8c461d246dd313b2eba2451f7028e39ebb46d68a6c3c18879e" gracePeriod=15 Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.443257 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts\") pod \"8948d5dc-068b-4571-bd13-f0cbf3750004\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.443385 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7tbt\" (UniqueName: \"kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt\") pod \"8948d5dc-068b-4571-bd13-f0cbf3750004\" (UID: \"8948d5dc-068b-4571-bd13-f0cbf3750004\") " Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444091 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-76frv\" (UniqueName: \"kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv\") pod \"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444118 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts\") pod \"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444115 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8948d5dc-068b-4571-bd13-f0cbf3750004" (UID: "8948d5dc-068b-4571-bd13-f0cbf3750004"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444140 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5cl\" (UniqueName: \"kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444238 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444581 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8948d5dc-068b-4571-bd13-f0cbf3750004-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444600 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.444617 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x57qm\" (UniqueName: \"kubernetes.io/projected/db7f0121-bd0f-4144-9b3b-98d0b4d2d682-kube-api-access-x57qm\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.445128 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " 
pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.448307 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt" (OuterVolumeSpecName: "kube-api-access-g7tbt") pod "8948d5dc-068b-4571-bd13-f0cbf3750004" (UID: "8948d5dc-068b-4571-bd13-f0cbf3750004"). InnerVolumeSpecName "kube-api-access-g7tbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.462293 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5cl\" (UniqueName: \"kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl\") pod \"glance-db-create-tg74r\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.546787 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76frv\" (UniqueName: \"kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv\") pod \"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.546828 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts\") pod \"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.550464 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts\") pod 
\"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.563735 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7tbt\" (UniqueName: \"kubernetes.io/projected/8948d5dc-068b-4571-bd13-f0cbf3750004-kube-api-access-g7tbt\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.569101 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tg74r" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.601696 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76frv\" (UniqueName: \"kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv\") pod \"glance-8770-account-create-update-p98hn\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.638682 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.854986 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2pq4k"] Mar 19 19:17:03 crc kubenswrapper[4826]: E0319 19:17:03.855564 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8948d5dc-068b-4571-bd13-f0cbf3750004" containerName="mariadb-database-create" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.855577 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8948d5dc-068b-4571-bd13-f0cbf3750004" containerName="mariadb-database-create" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.855801 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8948d5dc-068b-4571-bd13-f0cbf3750004" containerName="mariadb-database-create" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.859033 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vgjs9" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.860336 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vgjs9" event={"ID":"db7f0121-bd0f-4144-9b3b-98d0b4d2d682","Type":"ContainerDied","Data":"8aa1a0d86fa61584b7de591ed955cdcf1c139ac822b2c37e438ee27a766d670b"} Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.860365 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa1a0d86fa61584b7de591ed955cdcf1c139ac822b2c37e438ee27a766d670b" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.860417 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.864269 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" event={"ID":"8948d5dc-068b-4571-bd13-f0cbf3750004","Type":"ContainerDied","Data":"afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435"} Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.864301 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afd1e92cfaa6fa7389e9bd4561368a5538c47c08cccd89373d05bd1c53a94435" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.864368 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-v8dg5" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.871210 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2pq4k"] Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.884744 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c8fbc9df-9kq4s_8c238d2b-d512-419b-92dc-6d68bcba8655/console/0.log" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.884782 4826 generic.go:334] "Generic (PLEG): container finished" podID="8c238d2b-d512-419b-92dc-6d68bcba8655" containerID="ef28f7712a323c8c461d246dd313b2eba2451f7028e39ebb46d68a6c3c18879e" exitCode=2 Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.884812 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c8fbc9df-9kq4s" event={"ID":"8c238d2b-d512-419b-92dc-6d68bcba8655","Type":"ContainerDied","Data":"ef28f7712a323c8c461d246dd313b2eba2451f7028e39ebb46d68a6c3c18879e"} Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.892130 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt487\" (UniqueName: 
\"kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.896626 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.947602 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c8fbc9df-9kq4s_8c238d2b-d512-419b-92dc-6d68bcba8655/console/0.log" Mar 19 19:17:03 crc kubenswrapper[4826]: I0319 19:17:03.947674 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.002759 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.002837 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6v2x\" (UniqueName: \"kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.002870 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003074 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003137 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003165 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003215 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert\") pod \"8c238d2b-d512-419b-92dc-6d68bcba8655\" (UID: \"8c238d2b-d512-419b-92dc-6d68bcba8655\") " Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003613 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt487\" (UniqueName: \"kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " 
pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.003866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.004442 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.004553 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4236-account-create-update-cxc67"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.005041 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config" (OuterVolumeSpecName: "console-config") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.005112 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.005906 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.006135 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:04 crc kubenswrapper[4826]: E0319 19:17:04.009349 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c238d2b-d512-419b-92dc-6d68bcba8655" containerName="console" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.009424 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c238d2b-d512-419b-92dc-6d68bcba8655" containerName="console" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.009791 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c238d2b-d512-419b-92dc-6d68bcba8655" containerName="console" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.012835 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4236-account-create-update-cxc67"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.012928 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.014776 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.015591 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.017483 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x" (OuterVolumeSpecName: "kube-api-access-l6v2x") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "kube-api-access-l6v2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.021774 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c238d2b-d512-419b-92dc-6d68bcba8655" (UID: "8c238d2b-d512-419b-92dc-6d68bcba8655"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.028910 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt487\" (UniqueName: \"kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487\") pod \"keystone-db-create-2pq4k\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.105840 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffbl\" (UniqueName: \"kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106188 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106358 4826 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106369 4826 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106381 4826 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c238d2b-d512-419b-92dc-6d68bcba8655-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106390 4826 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106399 4826 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106408 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6v2x\" (UniqueName: \"kubernetes.io/projected/8c238d2b-d512-419b-92dc-6d68bcba8655-kube-api-access-l6v2x\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.106418 4826 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c238d2b-d512-419b-92dc-6d68bcba8655-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.149145 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pvnh2"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.150426 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.160342 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pvnh2"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.210271 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mffbl\" (UniqueName: \"kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.210467 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.210633 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.210810 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8hk2\" (UniqueName: \"kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.211405 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.225275 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffbl\" (UniqueName: \"kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl\") pod \"keystone-4236-account-create-update-cxc67\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.239700 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8770-account-create-update-p98hn"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.242181 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.313061 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8hk2\" (UniqueName: \"kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.313199 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.314050 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.330707 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8hk2\" (UniqueName: \"kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2\") pod \"placement-db-create-pvnh2\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.339284 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.344330 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tg74r"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.475696 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.747044 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2pq4k"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.902852 4826 generic.go:334] "Generic (PLEG): container finished" podID="a413e441-0fbd-4400-84be-959ce0870e4e" containerID="8b469432a0ecc0bb635b1e4504109b55673734f874c0b2fd95ffdc5772beb9f4" exitCode=0 Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.902927 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8770-account-create-update-p98hn" event={"ID":"a413e441-0fbd-4400-84be-959ce0870e4e","Type":"ContainerDied","Data":"8b469432a0ecc0bb635b1e4504109b55673734f874c0b2fd95ffdc5772beb9f4"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.902959 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8770-account-create-update-p98hn" event={"ID":"a413e441-0fbd-4400-84be-959ce0870e4e","Type":"ContainerStarted","Data":"50d012e101d2035a58066eaad55dd96b8c9271aa00973f34f2b5e90d1f51f193"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.910542 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tg74r" event={"ID":"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89","Type":"ContainerStarted","Data":"63d65f9f523a0d9cd6afff7dfc7e7d72734abd437ede480556980f4a6b515765"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.910580 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tg74r" 
event={"ID":"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89","Type":"ContainerStarted","Data":"ebd45a641e0e6375108f1eccfce5d2ffc763c7934ee5b11f53cb8e8fc1449c9a"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.916470 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54c8fbc9df-9kq4s_8c238d2b-d512-419b-92dc-6d68bcba8655/console/0.log" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.916572 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54c8fbc9df-9kq4s" event={"ID":"8c238d2b-d512-419b-92dc-6d68bcba8655","Type":"ContainerDied","Data":"12a01f38c7a125e4d2b8d03e1eb40cc01aec53b8fd0570c19324e12fdbc5e797"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.916617 4826 scope.go:117] "RemoveContainer" containerID="ef28f7712a323c8c461d246dd313b2eba2451f7028e39ebb46d68a6c3c18879e" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.916770 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54c8fbc9df-9kq4s" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.918647 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4236-account-create-update-cxc67"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.922114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2pq4k" event={"ID":"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9","Type":"ContainerStarted","Data":"629f709a281938b4b34d6935968c40e2bb37459000bcea0980ac12ae2955bb7d"} Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.951978 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-tg74r" podStartSLOduration=1.951945612 podStartE2EDuration="1.951945612s" podCreationTimestamp="2026-03-19 19:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
19:17:04.944520183 +0000 UTC m=+1249.698588516" watchObservedRunningTime="2026-03-19 19:17:04.951945612 +0000 UTC m=+1249.706013925" Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.974031 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:17:04 crc kubenswrapper[4826]: I0319 19:17:04.981676 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54c8fbc9df-9kq4s"] Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.036816 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pvnh2"] Mar 19 19:17:05 crc kubenswrapper[4826]: W0319 19:17:05.050558 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod992fbead_aec8_4dbc_875a_d01481bdec46.slice/crio-8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a WatchSource:0}: Error finding container 8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a: Status 404 returned error can't find the container with id 8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.936111 4826 generic.go:334] "Generic (PLEG): container finished" podID="992fbead-aec8-4dbc-875a-d01481bdec46" containerID="ceb48a59bd8031ce31f1a7b448d21311bbf6e5aecdc476c5598cb1b62e26981b" exitCode=0 Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.936206 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pvnh2" event={"ID":"992fbead-aec8-4dbc-875a-d01481bdec46","Type":"ContainerDied","Data":"ceb48a59bd8031ce31f1a7b448d21311bbf6e5aecdc476c5598cb1b62e26981b"} Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.936433 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pvnh2" 
event={"ID":"992fbead-aec8-4dbc-875a-d01481bdec46","Type":"ContainerStarted","Data":"8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a"} Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.941545 4826 generic.go:334] "Generic (PLEG): container finished" podID="4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" containerID="08316ceebdbdbe0ba6ffeda0832f85d21999a6b25a89134bbcbddbcd7aee20c4" exitCode=0 Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.941597 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2pq4k" event={"ID":"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9","Type":"ContainerDied","Data":"08316ceebdbdbe0ba6ffeda0832f85d21999a6b25a89134bbcbddbcd7aee20c4"} Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.943501 4826 generic.go:334] "Generic (PLEG): container finished" podID="0a008787-669e-41e3-9178-b37bc657c710" containerID="4ed752e3c7d3c4d8d33dbb01bf0870a3d2b1f4de97cd4b0d27f49a7a72e04507" exitCode=0 Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.943550 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4236-account-create-update-cxc67" event={"ID":"0a008787-669e-41e3-9178-b37bc657c710","Type":"ContainerDied","Data":"4ed752e3c7d3c4d8d33dbb01bf0870a3d2b1f4de97cd4b0d27f49a7a72e04507"} Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.943565 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4236-account-create-update-cxc67" event={"ID":"0a008787-669e-41e3-9178-b37bc657c710","Type":"ContainerStarted","Data":"0f9bab374eec1c1af7a9e1aaef42f2407b100ead4d8218a75d09a8fb42e9e103"} Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.945790 4826 generic.go:334] "Generic (PLEG): container finished" podID="e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" containerID="63d65f9f523a0d9cd6afff7dfc7e7d72734abd437ede480556980f4a6b515765" exitCode=0 Mar 19 19:17:05 crc kubenswrapper[4826]: I0319 19:17:05.945961 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-tg74r" event={"ID":"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89","Type":"ContainerDied","Data":"63d65f9f523a0d9cd6afff7dfc7e7d72734abd437ede480556980f4a6b515765"} Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.001009 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c238d2b-d512-419b-92dc-6d68bcba8655" path="/var/lib/kubelet/pods/8c238d2b-d512-419b-92dc-6d68bcba8655/volumes" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.302521 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.304785 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.334201 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.375337 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.375613 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv4xz\" (UniqueName: \"kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.376827 4826 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vgjs9"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.398795 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vgjs9"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.478361 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.479135 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.479285 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv4xz\" (UniqueName: \"kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.486521 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-9640-account-create-update-29jnn"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.488910 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.497978 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-9640-account-create-update-29jnn"] Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.510323 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.516505 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv4xz\" (UniqueName: \"kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz\") pod \"mysqld-exporter-openstack-cell1-db-create-kzx9b\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.581694 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqt8m\" (UniqueName: \"kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.582098 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.629084 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.684501 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.684714 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqt8m\" (UniqueName: \"kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.685737 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.704193 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqt8m\" (UniqueName: \"kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m\") pod \"mysqld-exporter-9640-account-create-update-29jnn\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:06 crc kubenswrapper[4826]: I0319 19:17:06.814961 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.399766 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.857881 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.862851 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.876637 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.900116 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.901397 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tg74r" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.913794 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts\") pod \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.913930 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt487\" (UniqueName: \"kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487\") pod \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\" (UID: \"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914036 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts\") pod \"a413e441-0fbd-4400-84be-959ce0870e4e\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914064 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76frv\" (UniqueName: \"kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv\") pod \"a413e441-0fbd-4400-84be-959ce0870e4e\" (UID: \"a413e441-0fbd-4400-84be-959ce0870e4e\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914105 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts\") pod \"992fbead-aec8-4dbc-875a-d01481bdec46\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914224 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t8hk2\" (UniqueName: \"kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2\") pod \"992fbead-aec8-4dbc-875a-d01481bdec46\" (UID: \"992fbead-aec8-4dbc-875a-d01481bdec46\") " Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914281 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" (UID: "4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914632 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.914776 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a413e441-0fbd-4400-84be-959ce0870e4e" (UID: "a413e441-0fbd-4400-84be-959ce0870e4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.915319 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "992fbead-aec8-4dbc-875a-d01481bdec46" (UID: "992fbead-aec8-4dbc-875a-d01481bdec46"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.922993 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2" (OuterVolumeSpecName: "kube-api-access-t8hk2") pod "992fbead-aec8-4dbc-875a-d01481bdec46" (UID: "992fbead-aec8-4dbc-875a-d01481bdec46"). InnerVolumeSpecName "kube-api-access-t8hk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.923791 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487" (OuterVolumeSpecName: "kube-api-access-lt487") pod "4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" (UID: "4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9"). InnerVolumeSpecName "kube-api-access-lt487". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.933437 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv" (OuterVolumeSpecName: "kube-api-access-76frv") pod "a413e441-0fbd-4400-84be-959ce0870e4e" (UID: "a413e441-0fbd-4400-84be-959ce0870e4e"). InnerVolumeSpecName "kube-api-access-76frv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.968761 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tg74r" event={"ID":"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89","Type":"ContainerDied","Data":"ebd45a641e0e6375108f1eccfce5d2ffc763c7934ee5b11f53cb8e8fc1449c9a"} Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.968801 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd45a641e0e6375108f1eccfce5d2ffc763c7934ee5b11f53cb8e8fc1449c9a" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.968858 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-tg74r" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.977744 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-pvnh2" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.983960 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2pq4k" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.986791 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8770-account-create-update-p98hn" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.990348 4826 generic.go:334] "Generic (PLEG): container finished" podID="9327e4ee-66b1-4f08-9cb8-9facc43491b4" containerID="10a151b600161f6457bd2b2786cc4e512f55741f9173282433b805759f2cb53a" exitCode=0 Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.994345 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4236-account-create-update-cxc67" Mar 19 19:17:07 crc kubenswrapper[4826]: I0319 19:17:07.996880 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7f0121-bd0f-4144-9b3b-98d0b4d2d682" path="/var/lib/kubelet/pods/db7f0121-bd0f-4144-9b3b-98d0b4d2d682/volumes" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003075 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pvnh2" event={"ID":"992fbead-aec8-4dbc-875a-d01481bdec46","Type":"ContainerDied","Data":"8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a"} Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003101 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aad4945a14cdc5d5055bab14f69e9469e0dfc945332c4b8484892eca1fcaf6a" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003113 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2pq4k" event={"ID":"4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9","Type":"ContainerDied","Data":"629f709a281938b4b34d6935968c40e2bb37459000bcea0980ac12ae2955bb7d"} Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003123 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="629f709a281938b4b34d6935968c40e2bb37459000bcea0980ac12ae2955bb7d" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003132 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8770-account-create-update-p98hn" event={"ID":"a413e441-0fbd-4400-84be-959ce0870e4e","Type":"ContainerDied","Data":"50d012e101d2035a58066eaad55dd96b8c9271aa00973f34f2b5e90d1f51f193"} Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003142 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d012e101d2035a58066eaad55dd96b8c9271aa00973f34f2b5e90d1f51f193" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003150 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s8pzl" event={"ID":"9327e4ee-66b1-4f08-9cb8-9facc43491b4","Type":"ContainerDied","Data":"10a151b600161f6457bd2b2786cc4e512f55741f9173282433b805759f2cb53a"} Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003161 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4236-account-create-update-cxc67" event={"ID":"0a008787-669e-41e3-9178-b37bc657c710","Type":"ContainerDied","Data":"0f9bab374eec1c1af7a9e1aaef42f2407b100ead4d8218a75d09a8fb42e9e103"} Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.003170 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9bab374eec1c1af7a9e1aaef42f2407b100ead4d8218a75d09a8fb42e9e103" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.015522 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts\") pod \"0a008787-669e-41e3-9178-b37bc657c710\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.015597 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts\") pod \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.015666 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mffbl\" (UniqueName: \"kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl\") pod \"0a008787-669e-41e3-9178-b37bc657c710\" (UID: \"0a008787-669e-41e3-9178-b37bc657c710\") " Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.015814 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-vh5cl\" (UniqueName: \"kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl\") pod \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\" (UID: \"e43282f8-6f7c-4cb5-a9cd-79cc13a5be89\") " Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016265 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" (UID: "e43282f8-6f7c-4cb5-a9cd-79cc13a5be89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016281 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a413e441-0fbd-4400-84be-959ce0870e4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016295 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76frv\" (UniqueName: \"kubernetes.io/projected/a413e441-0fbd-4400-84be-959ce0870e4e-kube-api-access-76frv\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016305 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992fbead-aec8-4dbc-875a-d01481bdec46-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016314 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8hk2\" (UniqueName: \"kubernetes.io/projected/992fbead-aec8-4dbc-875a-d01481bdec46-kube-api-access-t8hk2\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016322 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt487\" (UniqueName: 
\"kubernetes.io/projected/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9-kube-api-access-lt487\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.016947 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a008787-669e-41e3-9178-b37bc657c710" (UID: "0a008787-669e-41e3-9178-b37bc657c710"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.025816 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl" (OuterVolumeSpecName: "kube-api-access-vh5cl") pod "e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" (UID: "e43282f8-6f7c-4cb5-a9cd-79cc13a5be89"). InnerVolumeSpecName "kube-api-access-vh5cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.027892 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl" (OuterVolumeSpecName: "kube-api-access-mffbl") pod "0a008787-669e-41e3-9178-b37bc657c710" (UID: "0a008787-669e-41e3-9178-b37bc657c710"). InnerVolumeSpecName "kube-api-access-mffbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.117782 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wdll6" podUID="2ed5ed9d-f761-4b5d-8cc8-07693c1d1289" containerName="ovn-controller" probeResult="failure" output=< Mar 19 19:17:08 crc kubenswrapper[4826]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 19:17:08 crc kubenswrapper[4826]: > Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.122635 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.122675 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mffbl\" (UniqueName: \"kubernetes.io/projected/0a008787-669e-41e3-9178-b37bc657c710-kube-api-access-mffbl\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.122686 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5cl\" (UniqueName: \"kubernetes.io/projected/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89-kube-api-access-vh5cl\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.122694 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a008787-669e-41e3-9178-b37bc657c710-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.215193 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b"] Mar 19 19:17:08 crc kubenswrapper[4826]: W0319 19:17:08.218545 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d279b05_60b2_4405_8ba1_11e707e145fe.slice/crio-7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91 WatchSource:0}: Error finding container 7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91: Status 404 returned error can't find the container with id 7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91 Mar 19 19:17:08 crc kubenswrapper[4826]: I0319 19:17:08.376406 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-9640-account-create-update-29jnn"] Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.008987 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerStarted","Data":"5d9bfa8cd13f3a3eda081700b8b5fea64497a2c1cabd846849199cb1bd8f95c5"} Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.011186 4826 generic.go:334] "Generic (PLEG): container finished" podID="4d279b05-60b2-4405-8ba1-11e707e145fe" containerID="3087dbeb19b6d1d1ff271676afbdb2242509e1341844f8b9797c3b56b1f7236d" exitCode=0 Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.011263 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" event={"ID":"4d279b05-60b2-4405-8ba1-11e707e145fe","Type":"ContainerDied","Data":"3087dbeb19b6d1d1ff271676afbdb2242509e1341844f8b9797c3b56b1f7236d"} Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.011299 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" event={"ID":"4d279b05-60b2-4405-8ba1-11e707e145fe","Type":"ContainerStarted","Data":"7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91"} Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.013279 4826 generic.go:334] "Generic (PLEG): container finished" podID="c459bda1-b58d-4425-b401-1493252e282d" 
containerID="3fe5f5bc1379a3e4a6de1823a8ed65844c5aee9768eb7d386cc56f6a4891adc1" exitCode=0 Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.013332 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" event={"ID":"c459bda1-b58d-4425-b401-1493252e282d","Type":"ContainerDied","Data":"3fe5f5bc1379a3e4a6de1823a8ed65844c5aee9768eb7d386cc56f6a4891adc1"} Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.013427 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" event={"ID":"c459bda1-b58d-4425-b401-1493252e282d","Type":"ContainerStarted","Data":"1d7090cc888fa694c054a9288c44b98f1a3b24d6a2cb9aa25c528c41fcec38ff"} Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.048052 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.185627649 podStartE2EDuration="1m3.048032362s" podCreationTimestamp="2026-03-19 19:16:06 +0000 UTC" firstStartedPulling="2026-03-19 19:16:26.881537859 +0000 UTC m=+1211.635606172" lastFinishedPulling="2026-03-19 19:17:07.743942572 +0000 UTC m=+1252.498010885" observedRunningTime="2026-03-19 19:17:09.034799361 +0000 UTC m=+1253.788867684" watchObservedRunningTime="2026-03-19 19:17:09.048032362 +0000 UTC m=+1253.802100675" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.391533 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449154 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449216 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449269 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449356 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449448 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449496 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hldr6\" 
(UniqueName: \"kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.449752 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift\") pod \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\" (UID: \"9327e4ee-66b1-4f08-9cb8-9facc43491b4\") " Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.450672 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.451679 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.451920 4826 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9327e4ee-66b1-4f08-9cb8-9facc43491b4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.451940 4826 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.474418 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6" (OuterVolumeSpecName: "kube-api-access-hldr6") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "kube-api-access-hldr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.477121 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.480913 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts" (OuterVolumeSpecName: "scripts") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.481555 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.493231 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9327e4ee-66b1-4f08-9cb8-9facc43491b4" (UID: "9327e4ee-66b1-4f08-9cb8-9facc43491b4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.554090 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.554132 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hldr6\" (UniqueName: \"kubernetes.io/projected/9327e4ee-66b1-4f08-9cb8-9facc43491b4-kube-api-access-hldr6\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.554149 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9327e4ee-66b1-4f08-9cb8-9facc43491b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.554161 4826 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 
19:17:09 crc kubenswrapper[4826]: I0319 19:17:09.554172 4826 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9327e4ee-66b1-4f08-9cb8-9facc43491b4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.025606 4826 generic.go:334] "Generic (PLEG): container finished" podID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerID="d503c343673c800d54ff6e6cc56a18acaec57bf393c9bb2a22d379eea6512b2d" exitCode=0 Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.025695 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerDied","Data":"d503c343673c800d54ff6e6cc56a18acaec57bf393c9bb2a22d379eea6512b2d"} Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.028378 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s8pzl" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.029037 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s8pzl" event={"ID":"9327e4ee-66b1-4f08-9cb8-9facc43491b4","Type":"ContainerDied","Data":"d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7"} Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.029060 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d008225b3b1a49a120d470b6ffbeadd67219f520c9cee8e470ef8843054a0bb7" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.514904 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.520523 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.574644 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.603328 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.674430 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv4xz\" (UniqueName: \"kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz\") pod \"4d279b05-60b2-4405-8ba1-11e707e145fe\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.674727 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqt8m\" (UniqueName: \"kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m\") pod \"c459bda1-b58d-4425-b401-1493252e282d\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.674776 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts\") pod \"4d279b05-60b2-4405-8ba1-11e707e145fe\" (UID: \"4d279b05-60b2-4405-8ba1-11e707e145fe\") " Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.674902 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts\") pod \"c459bda1-b58d-4425-b401-1493252e282d\" (UID: \"c459bda1-b58d-4425-b401-1493252e282d\") " Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.675362 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d279b05-60b2-4405-8ba1-11e707e145fe" (UID: "4d279b05-60b2-4405-8ba1-11e707e145fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.675387 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c459bda1-b58d-4425-b401-1493252e282d" (UID: "c459bda1-b58d-4425-b401-1493252e282d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.675972 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d279b05-60b2-4405-8ba1-11e707e145fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.675992 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c459bda1-b58d-4425-b401-1493252e282d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.679137 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz" (OuterVolumeSpecName: "kube-api-access-dv4xz") pod "4d279b05-60b2-4405-8ba1-11e707e145fe" (UID: "4d279b05-60b2-4405-8ba1-11e707e145fe"). 
InnerVolumeSpecName "kube-api-access-dv4xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.707595 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m" (OuterVolumeSpecName: "kube-api-access-dqt8m") pod "c459bda1-b58d-4425-b401-1493252e282d" (UID: "c459bda1-b58d-4425-b401-1493252e282d"). InnerVolumeSpecName "kube-api-access-dqt8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.777648 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqt8m\" (UniqueName: \"kubernetes.io/projected/c459bda1-b58d-4425-b401-1493252e282d-kube-api-access-dqt8m\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:10 crc kubenswrapper[4826]: I0319 19:17:10.777748 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv4xz\" (UniqueName: \"kubernetes.io/projected/4d279b05-60b2-4405-8ba1-11e707e145fe-kube-api-access-dv4xz\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.057570 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerStarted","Data":"0905438bb03dc381ef571e8ce2b64bc797077a674ddcc377521605ee89289434"} Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.057972 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.060238 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" event={"ID":"4d279b05-60b2-4405-8ba1-11e707e145fe","Type":"ContainerDied","Data":"7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91"} Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 
19:17:11.060385 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a15cb67bb3643eeeae8cba9f178b8e4060db1f0c4d0954d923dc68f01315b91" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.060331 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.062239 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" event={"ID":"c459bda1-b58d-4425-b401-1493252e282d","Type":"ContainerDied","Data":"1d7090cc888fa694c054a9288c44b98f1a3b24d6a2cb9aa25c528c41fcec38ff"} Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.062289 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7090cc888fa694c054a9288c44b98f1a3b24d6a2cb9aa25c528c41fcec38ff" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.062329 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-9640-account-create-update-29jnn" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.095113 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371963.759687 podStartE2EDuration="1m13.095087698s" podCreationTimestamp="2026-03-19 19:15:58 +0000 UTC" firstStartedPulling="2026-03-19 19:16:00.511468672 +0000 UTC m=+1185.265536985" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:11.087787071 +0000 UTC m=+1255.841855394" watchObservedRunningTime="2026-03-19 19:17:11.095087698 +0000 UTC m=+1255.849156011" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.303407 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mk77d"] Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.304255 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9327e4ee-66b1-4f08-9cb8-9facc43491b4" containerName="swift-ring-rebalance" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.304377 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9327e4ee-66b1-4f08-9cb8-9facc43491b4" containerName="swift-ring-rebalance" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.304469 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.304539 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.304619 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c459bda1-b58d-4425-b401-1493252e282d" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.304740 4826 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c459bda1-b58d-4425-b401-1493252e282d" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.304830 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a008787-669e-41e3-9178-b37bc657c710" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.304898 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a008787-669e-41e3-9178-b37bc657c710" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.304957 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d279b05-60b2-4405-8ba1-11e707e145fe" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305008 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d279b05-60b2-4405-8ba1-11e707e145fe" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.305076 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992fbead-aec8-4dbc-875a-d01481bdec46" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305141 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="992fbead-aec8-4dbc-875a-d01481bdec46" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.305212 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a413e441-0fbd-4400-84be-959ce0870e4e" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305262 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a413e441-0fbd-4400-84be-959ce0870e4e" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: E0319 19:17:11.305316 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" containerName="mariadb-database-create" Mar 19 
19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305365 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305640 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c459bda1-b58d-4425-b401-1493252e282d" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305742 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d279b05-60b2-4405-8ba1-11e707e145fe" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305801 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9327e4ee-66b1-4f08-9cb8-9facc43491b4" containerName="swift-ring-rebalance" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305856 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305907 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a008787-669e-41e3-9178-b37bc657c710" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.305982 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="992fbead-aec8-4dbc-875a-d01481bdec46" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.306046 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" containerName="mariadb-database-create" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.306108 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a413e441-0fbd-4400-84be-959ce0870e4e" containerName="mariadb-account-create-update" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.306971 4826 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.310638 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.321215 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mk77d"] Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.504007 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts\") pod \"root-account-create-update-mk77d\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.504492 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6pt\" (UniqueName: \"kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt\") pod \"root-account-create-update-mk77d\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.609813 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6pt\" (UniqueName: \"kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt\") pod \"root-account-create-update-mk77d\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.609943 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts\") pod \"root-account-create-update-mk77d\" 
(UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.610584 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts\") pod \"root-account-create-update-mk77d\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.636906 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6pt\" (UniqueName: \"kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt\") pod \"root-account-create-update-mk77d\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:11 crc kubenswrapper[4826]: I0319 19:17:11.686049 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:12 crc kubenswrapper[4826]: I0319 19:17:12.161230 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mk77d"] Mar 19 19:17:12 crc kubenswrapper[4826]: W0319 19:17:12.167804 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod990ccce0_d217_45bb_a1de_f1da6d07e6f4.slice/crio-332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634 WatchSource:0}: Error finding container 332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634: Status 404 returned error can't find the container with id 332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634 Mar 19 19:17:12 crc kubenswrapper[4826]: I0319 19:17:12.533917 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.085556 4826 generic.go:334] "Generic (PLEG): container finished" podID="990ccce0-d217-45bb-a1de-f1da6d07e6f4" containerID="926bfc6635ef53ba250973f547f5e9e47d5486dc664c3fc6813fb4c99d91666b" exitCode=0 Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.085691 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mk77d" event={"ID":"990ccce0-d217-45bb-a1de-f1da6d07e6f4","Type":"ContainerDied","Data":"926bfc6635ef53ba250973f547f5e9e47d5486dc664c3fc6813fb4c99d91666b"} Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.086025 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mk77d" event={"ID":"990ccce0-d217-45bb-a1de-f1da6d07e6f4","Type":"ContainerStarted","Data":"332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634"} Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.137586 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-wdll6" podUID="2ed5ed9d-f761-4b5d-8cc8-07693c1d1289" containerName="ovn-controller" probeResult="failure" output=< Mar 19 19:17:13 crc kubenswrapper[4826]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 19:17:13 crc kubenswrapper[4826]: > Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.140999 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.159557 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2vpmv" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.419532 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wdll6-config-nrjr7"] Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.420819 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.422813 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.452760 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6-config-nrjr7"] Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.538985 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rvwbt"] Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.540176 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.543416 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.543421 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jrns4" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548322 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548372 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkz4\" (UniqueName: \"kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548420 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548461 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: 
\"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548487 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.548541 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.552641 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvwbt"] Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.649975 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650036 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkz4\" (UniqueName: \"kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650083 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650115 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650142 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650166 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29wd\" (UniqueName: \"kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650195 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650216 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650277 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650281 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650313 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650288 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.650485 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.651077 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.652545 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.670015 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkz4\" (UniqueName: \"kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4\") pod \"ovn-controller-wdll6-config-nrjr7\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.751960 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.752640 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.752820 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.752866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.752902 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29wd\" (UniqueName: \"kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.756525 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 
19:17:13.756548 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.756875 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.768576 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29wd\" (UniqueName: \"kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd\") pod \"glance-db-sync-rvwbt\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:13 crc kubenswrapper[4826]: I0319 19:17:13.889331 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.246079 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6-config-nrjr7"] Mar 19 19:17:14 crc kubenswrapper[4826]: W0319 19:17:14.247867 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1084d5d_91b3_4660_88a3_85f5843f5cb4.slice/crio-2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a WatchSource:0}: Error finding container 2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a: Status 404 returned error can't find the container with id 2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.467582 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:14 crc kubenswrapper[4826]: W0319 19:17:14.474701 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d888eb3_4bcd_470d_95c6_aa3d281c6332.slice/crio-eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba WatchSource:0}: Error finding container eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba: Status 404 returned error can't find the container with id eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.478132 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvwbt"] Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.573823 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6pt\" (UniqueName: \"kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt\") pod \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\" (UID: 
\"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.574002 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts\") pod \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\" (UID: \"990ccce0-d217-45bb-a1de-f1da6d07e6f4\") " Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.575015 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "990ccce0-d217-45bb-a1de-f1da6d07e6f4" (UID: "990ccce0-d217-45bb-a1de-f1da6d07e6f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.580571 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt" (OuterVolumeSpecName: "kube-api-access-sd6pt") pod "990ccce0-d217-45bb-a1de-f1da6d07e6f4" (UID: "990ccce0-d217-45bb-a1de-f1da6d07e6f4"). InnerVolumeSpecName "kube-api-access-sd6pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.676563 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6pt\" (UniqueName: \"kubernetes.io/projected/990ccce0-d217-45bb-a1de-f1da6d07e6f4-kube-api-access-sd6pt\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:14 crc kubenswrapper[4826]: I0319 19:17:14.676624 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/990ccce0-d217-45bb-a1de-f1da6d07e6f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.124096 4826 generic.go:334] "Generic (PLEG): container finished" podID="b1084d5d-91b3-4660-88a3-85f5843f5cb4" containerID="3ab563d3809408b45476ed084a9feb0b7ea57fe73e485377a7e6e1960114f76f" exitCode=0 Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.124138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-nrjr7" event={"ID":"b1084d5d-91b3-4660-88a3-85f5843f5cb4","Type":"ContainerDied","Data":"3ab563d3809408b45476ed084a9feb0b7ea57fe73e485377a7e6e1960114f76f"} Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.124394 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-nrjr7" event={"ID":"b1084d5d-91b3-4660-88a3-85f5843f5cb4","Type":"ContainerStarted","Data":"2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a"} Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.125895 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mk77d" event={"ID":"990ccce0-d217-45bb-a1de-f1da6d07e6f4","Type":"ContainerDied","Data":"332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634"} Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.125935 4826 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="332b0357b95df0cf9b8d56aaad44cda5668d7e07f40be42d934926d1f809a634" Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.125908 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mk77d" Mar 19 19:17:15 crc kubenswrapper[4826]: I0319 19:17:15.126829 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvwbt" event={"ID":"9d888eb3-4bcd-470d-95c6-aa3d281c6332","Type":"ContainerStarted","Data":"eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba"} Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.522023 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:17:16 crc kubenswrapper[4826]: E0319 19:17:16.522906 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990ccce0-d217-45bb-a1de-f1da6d07e6f4" containerName="mariadb-account-create-update" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.522920 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="990ccce0-d217-45bb-a1de-f1da6d07e6f4" containerName="mariadb-account-create-update" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.523112 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="990ccce0-d217-45bb-a1de-f1da6d07e6f4" containerName="mariadb-account-create-update" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.523768 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.526807 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.546544 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.566690 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.623465 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.623603 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tbt6\" (UniqueName: \"kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.623662 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725353 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725447 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 
19:17:16.725519 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725583 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhkz4\" (UniqueName: \"kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725641 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725675 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts\") pod \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\" (UID: \"b1084d5d-91b3-4660-88a3-85f5843f5cb4\") " Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.725950 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tbt6\" (UniqueName: \"kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.726000 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle\") pod 
\"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.726096 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.727022 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.727077 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.727734 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run" (OuterVolumeSpecName: "var-run") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.727823 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.729239 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts" (OuterVolumeSpecName: "scripts") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.733162 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.735810 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4" (OuterVolumeSpecName: "kube-api-access-bhkz4") pod "b1084d5d-91b3-4660-88a3-85f5843f5cb4" (UID: "b1084d5d-91b3-4660-88a3-85f5843f5cb4"). InnerVolumeSpecName "kube-api-access-bhkz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.749529 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.769309 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tbt6\" (UniqueName: \"kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6\") pod \"mysqld-exporter-0\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " pod="openstack/mysqld-exporter-0" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829247 4826 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829296 4826 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829308 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhkz4\" (UniqueName: \"kubernetes.io/projected/b1084d5d-91b3-4660-88a3-85f5843f5cb4-kube-api-access-bhkz4\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829319 4826 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1084d5d-91b3-4660-88a3-85f5843f5cb4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829326 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.829334 4826 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1084d5d-91b3-4660-88a3-85f5843f5cb4-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:16 crc kubenswrapper[4826]: I0319 19:17:16.886409 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.152436 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-nrjr7" event={"ID":"b1084d5d-91b3-4660-88a3-85f5843f5cb4","Type":"ContainerDied","Data":"2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a"} Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.152698 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bf646017eaed73c7c60cc9d62a6143d2f1d7e5a2590bef0342f20196d61d59a" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.152751 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdll6-config-nrjr7" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.523510 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:17:17 crc kubenswrapper[4826]: W0319 19:17:17.546135 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod144ed0ba_58bd_4991_92af_8ed3f401e52c.slice/crio-744ebc93e8d7de4b825bb37739953fa667633fcf7cc9927c4152231f46c21762 WatchSource:0}: Error finding container 744ebc93e8d7de4b825bb37739953fa667633fcf7cc9927c4152231f46c21762: Status 404 returned error can't find the container with id 744ebc93e8d7de4b825bb37739953fa667633fcf7cc9927c4152231f46c21762 Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.674276 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wdll6-config-nrjr7"] Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.687281 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wdll6-config-nrjr7"] Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.868081 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wdll6-config-9xnbx"] Mar 19 19:17:17 crc kubenswrapper[4826]: E0319 19:17:17.869207 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1084d5d-91b3-4660-88a3-85f5843f5cb4" containerName="ovn-config" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.869771 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1084d5d-91b3-4660-88a3-85f5843f5cb4" containerName="ovn-config" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.870449 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1084d5d-91b3-4660-88a3-85f5843f5cb4" containerName="ovn-config" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.872150 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.875953 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.896292 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6-config-9xnbx"] Mar 19 19:17:17 crc kubenswrapper[4826]: I0319 19:17:17.990753 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1084d5d-91b3-4660-88a3-85f5843f5cb4" path="/var/lib/kubelet/pods/b1084d5d-91b3-4660-88a3-85f5843f5cb4/volumes" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.066459 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.066790 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.066904 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28gs\" (UniqueName: \"kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.066981 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.067027 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.067124 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.117775 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wdll6" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.163696 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"144ed0ba-58bd-4991-92af-8ed3f401e52c","Type":"ContainerStarted","Data":"744ebc93e8d7de4b825bb37739953fa667633fcf7cc9927c4152231f46c21762"} Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.169768 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " 
pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.169885 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.169923 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28gs\" (UniqueName: \"kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.169953 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.169977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.170021 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " 
pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.170639 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.170640 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.170717 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.171783 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.173857 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc 
kubenswrapper[4826]: I0319 19:17:18.202611 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28gs\" (UniqueName: \"kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs\") pod \"ovn-controller-wdll6-config-9xnbx\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.501492 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:18 crc kubenswrapper[4826]: I0319 19:17:18.998515 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdll6-config-9xnbx"] Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.090961 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.098249 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/775f9d8a-377a-4913-b2d2-3bb1b7aec077-etc-swift\") pod \"swift-storage-0\" (UID: \"775f9d8a-377a-4913-b2d2-3bb1b7aec077\") " pod="openstack/swift-storage-0" Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.293327 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 19:17:19 crc kubenswrapper[4826]: W0319 19:17:19.456994 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cddaee6_7871_4749_a9c1_43dce8c04cc0.slice/crio-b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2 WatchSource:0}: Error finding container b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2: Status 404 returned error can't find the container with id b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2 Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.772011 4826 scope.go:117] "RemoveContainer" containerID="d4d086f860457e06a76634d8648db5c3123c55b723c9efe84cd21a64cefdad84" Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.811518 4826 scope.go:117] "RemoveContainer" containerID="9ed41f3822a8304e04979e86ce35e8cbeb0677462e0e24b59bb97f9799354ab5" Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.840126 4826 scope.go:117] "RemoveContainer" containerID="2a327b90a2b9883180d1d7cf93684b3f22f1a73a49c53c9ce2b62ae5b1bc9116" Mar 19 19:17:19 crc kubenswrapper[4826]: I0319 19:17:19.929130 4826 scope.go:117] "RemoveContainer" containerID="f3f1edd3c076f036101b3c40c3a65a6adb296d8bbed9e027eebd4f204bf57127" Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.052744 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 19:17:20 crc kubenswrapper[4826]: W0319 19:17:20.067478 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod775f9d8a_377a_4913_b2d2_3bb1b7aec077.slice/crio-c13ac9a4f3992d754941113a20d1d6cbc97bafdb3a3e0ea2dbeabd07bc5172a3 WatchSource:0}: Error finding container c13ac9a4f3992d754941113a20d1d6cbc97bafdb3a3e0ea2dbeabd07bc5172a3: Status 404 returned error can't find the container with id 
c13ac9a4f3992d754941113a20d1d6cbc97bafdb3a3e0ea2dbeabd07bc5172a3 Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.186147 4826 generic.go:334] "Generic (PLEG): container finished" podID="4cddaee6-7871-4749-a9c1-43dce8c04cc0" containerID="a3f356020523a8534a55d56aa1622f99d7f72f9c6e7be99763f558b6abe84902" exitCode=0 Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.186224 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-9xnbx" event={"ID":"4cddaee6-7871-4749-a9c1-43dce8c04cc0","Type":"ContainerDied","Data":"a3f356020523a8534a55d56aa1622f99d7f72f9c6e7be99763f558b6abe84902"} Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.186250 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-9xnbx" event={"ID":"4cddaee6-7871-4749-a9c1-43dce8c04cc0","Type":"ContainerStarted","Data":"b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2"} Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.188929 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"c13ac9a4f3992d754941113a20d1d6cbc97bafdb3a3e0ea2dbeabd07bc5172a3"} Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.190451 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"144ed0ba-58bd-4991-92af-8ed3f401e52c","Type":"ContainerStarted","Data":"1fa1f710f9d4180d7dfec7f36e73a9bbdc447e1b123dce5f4813e8f7c60a6c69"} Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.239828 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.267253213 podStartE2EDuration="4.239805017s" podCreationTimestamp="2026-03-19 19:17:16 +0000 UTC" firstStartedPulling="2026-03-19 19:17:17.548067201 +0000 UTC m=+1262.302135514" lastFinishedPulling="2026-03-19 19:17:19.520619005 +0000 UTC 
m=+1264.274687318" observedRunningTime="2026-03-19 19:17:20.217369645 +0000 UTC m=+1264.971437968" watchObservedRunningTime="2026-03-19 19:17:20.239805017 +0000 UTC m=+1264.993873330" Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.469581 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.574806 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 19 19:17:20 crc kubenswrapper[4826]: I0319 19:17:20.605880 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.656783 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.762977 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts\") pod \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763092 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run\") pod \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763118 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn\") pod 
\"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763155 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h28gs\" (UniqueName: \"kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs\") pod \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763245 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts\") pod \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763279 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn\") pod \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\" (UID: \"4cddaee6-7871-4749-a9c1-43dce8c04cc0\") " Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.763714 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.764073 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run" (OuterVolumeSpecName: "var-run") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.764153 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.764605 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts" (OuterVolumeSpecName: "scripts") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.765228 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.784932 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs" (OuterVolumeSpecName: "kube-api-access-h28gs") pod "4cddaee6-7871-4749-a9c1-43dce8c04cc0" (UID: "4cddaee6-7871-4749-a9c1-43dce8c04cc0"). InnerVolumeSpecName "kube-api-access-h28gs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865354 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865384 4826 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865395 4826 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865404 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h28gs\" (UniqueName: \"kubernetes.io/projected/4cddaee6-7871-4749-a9c1-43dce8c04cc0-kube-api-access-h28gs\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865413 4826 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4cddaee6-7871-4749-a9c1-43dce8c04cc0-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:21 crc kubenswrapper[4826]: I0319 19:17:21.865421 4826 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4cddaee6-7871-4749-a9c1-43dce8c04cc0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.210977 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdll6-config-9xnbx" event={"ID":"4cddaee6-7871-4749-a9c1-43dce8c04cc0","Type":"ContainerDied","Data":"b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2"} Mar 19 19:17:22 crc 
kubenswrapper[4826]: I0319 19:17:22.211009 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8cae8480cc520bff10f7476ad6372f6af255b01d44740bca1bdab89a0ec14f2" Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.211095 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdll6-config-9xnbx" Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.533891 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.538132 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.784333 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wdll6-config-9xnbx"] Mar 19 19:17:22 crc kubenswrapper[4826]: I0319 19:17:22.797296 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wdll6-config-9xnbx"] Mar 19 19:17:23 crc kubenswrapper[4826]: I0319 19:17:23.223928 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:24 crc kubenswrapper[4826]: I0319 19:17:24.015076 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cddaee6-7871-4749-a9c1-43dce8c04cc0" path="/var/lib/kubelet/pods/4cddaee6-7871-4749-a9c1-43dce8c04cc0/volumes" Mar 19 19:17:25 crc kubenswrapper[4826]: I0319 19:17:25.137841 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:17:25 crc kubenswrapper[4826]: I0319 19:17:25.239077 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="prometheus" 
containerID="cri-o://c0f1f7d466460401af90a3d9bff2ecdb385e1668e8526c5c229b079613d3dbec" gracePeriod=600 Mar 19 19:17:25 crc kubenswrapper[4826]: I0319 19:17:25.239195 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="config-reloader" containerID="cri-o://8c9c5267be455ec2905d4a2bf8db5ab89c8e3ed5aac699b450dfeebccdf5f050" gracePeriod=600 Mar 19 19:17:25 crc kubenswrapper[4826]: I0319 19:17:25.239107 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="thanos-sidecar" containerID="cri-o://5d9bfa8cd13f3a3eda081700b8b5fea64497a2c1cabd846849199cb1bd8f95c5" gracePeriod=600 Mar 19 19:17:26 crc kubenswrapper[4826]: I0319 19:17:26.250205 4826 generic.go:334] "Generic (PLEG): container finished" podID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerID="5d9bfa8cd13f3a3eda081700b8b5fea64497a2c1cabd846849199cb1bd8f95c5" exitCode=0 Mar 19 19:17:26 crc kubenswrapper[4826]: I0319 19:17:26.250443 4826 generic.go:334] "Generic (PLEG): container finished" podID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerID="8c9c5267be455ec2905d4a2bf8db5ab89c8e3ed5aac699b450dfeebccdf5f050" exitCode=0 Mar 19 19:17:26 crc kubenswrapper[4826]: I0319 19:17:26.250277 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerDied","Data":"5d9bfa8cd13f3a3eda081700b8b5fea64497a2c1cabd846849199cb1bd8f95c5"} Mar 19 19:17:26 crc kubenswrapper[4826]: I0319 19:17:26.250483 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerDied","Data":"8c9c5267be455ec2905d4a2bf8db5ab89c8e3ed5aac699b450dfeebccdf5f050"} Mar 19 19:17:26 crc 
kubenswrapper[4826]: I0319 19:17:26.250495 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerDied","Data":"c0f1f7d466460401af90a3d9bff2ecdb385e1668e8526c5c229b079613d3dbec"} Mar 19 19:17:26 crc kubenswrapper[4826]: I0319 19:17:26.250452 4826 generic.go:334] "Generic (PLEG): container finished" podID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerID="c0f1f7d466460401af90a3d9bff2ecdb385e1668e8526c5c229b079613d3dbec" exitCode=0 Mar 19 19:17:27 crc kubenswrapper[4826]: I0319 19:17:27.533637 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.143:9090/-/ready\": dial tcp 10.217.0.143:9090: connect: connection refused" Mar 19 19:17:29 crc kubenswrapper[4826]: I0319 19:17:29.844942 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.470928 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.646471 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759433 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759557 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759594 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759682 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759709 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 
19:17:30.759736 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759869 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.759958 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.760037 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.760064 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlj5w\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.760185 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out\") pod \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\" (UID: \"63839c94-d94a-4fe8-a195-b86a6a9e8b79\") " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.761128 4826 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.761891 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.764184 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w" (OuterVolumeSpecName: "kube-api-access-rlj5w") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). 
InnerVolumeSpecName "kube-api-access-rlj5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.764230 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.769807 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.770640 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config" (OuterVolumeSpecName: "config") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.771697 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out" (OuterVolumeSpecName: "config-out") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.776241 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.792905 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config" (OuterVolumeSpecName: "web-config") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.836342 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "63839c94-d94a-4fe8-a195-b86a6a9e8b79" (UID: "63839c94-d94a-4fe8-a195-b86a6a9e8b79"). InnerVolumeSpecName "pvc-d1bf03f1-4718-4f37-9624-fffdd3002646". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863124 4826 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863154 4826 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-web-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863165 4826 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863204 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") on node \"crc\" " Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863217 4826 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863226 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlj5w\" (UniqueName: \"kubernetes.io/projected/63839c94-d94a-4fe8-a195-b86a6a9e8b79-kube-api-access-rlj5w\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863236 4826 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config-out\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863246 4826 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/63839c94-d94a-4fe8-a195-b86a6a9e8b79-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.863255 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/63839c94-d94a-4fe8-a195-b86a6a9e8b79-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.889963 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.890133 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d1bf03f1-4718-4f37-9624-fffdd3002646" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646") on node "crc" Mar 19 19:17:30 crc kubenswrapper[4826]: I0319 19:17:30.964572 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.304887 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvwbt" event={"ID":"9d888eb3-4bcd-470d-95c6-aa3d281c6332","Type":"ContainerStarted","Data":"008f296df9d929c34c10dbca826659686020eccb5e94ffc11af8acc74b3caddd"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.314322 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"63839c94-d94a-4fe8-a195-b86a6a9e8b79","Type":"ContainerDied","Data":"d4323ce2073b9fe115b288f1a909240a1be6b110f1a4e83906c64c507040c4f2"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.314376 4826 scope.go:117] "RemoveContainer" containerID="5d9bfa8cd13f3a3eda081700b8b5fea64497a2c1cabd846849199cb1bd8f95c5" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.314505 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.326775 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"1da2fe2a43cfcf0143ea60e28c432326214cfc8b5254c1c791bdb7b83e234b82"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.326819 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"74754e6f32f5d40c1420a4ab60a0ba35e1001706e7685fbc530e623b436914d7"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.326829 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"769aa7ab20c5f351dbe38c4d969921bfc9a62ee0470d3bcc4229a135c8b69a3c"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.326839 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"9221d216fa9e9fabdd54fa8a9b39598830236b87b2564ae9c62efbe175aad07a"} Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.435979 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rvwbt" podStartSLOduration=2.448721702 podStartE2EDuration="18.4359571s" podCreationTimestamp="2026-03-19 
19:17:13 +0000 UTC" firstStartedPulling="2026-03-19 19:17:14.47798227 +0000 UTC m=+1259.232050583" lastFinishedPulling="2026-03-19 19:17:30.465217668 +0000 UTC m=+1275.219285981" observedRunningTime="2026-03-19 19:17:31.344008709 +0000 UTC m=+1276.098077042" watchObservedRunningTime="2026-03-19 19:17:31.4359571 +0000 UTC m=+1276.190025413" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.440744 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.447338 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.449551 4826 scope.go:117] "RemoveContainer" containerID="8c9c5267be455ec2905d4a2bf8db5ab89c8e3ed5aac699b450dfeebccdf5f050" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463090 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:17:31 crc kubenswrapper[4826]: E0319 19:17:31.463585 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cddaee6-7871-4749-a9c1-43dce8c04cc0" containerName="ovn-config" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463603 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cddaee6-7871-4749-a9c1-43dce8c04cc0" containerName="ovn-config" Mar 19 19:17:31 crc kubenswrapper[4826]: E0319 19:17:31.463621 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="init-config-reloader" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463627 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="init-config-reloader" Mar 19 19:17:31 crc kubenswrapper[4826]: E0319 19:17:31.463648 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" 
containerName="thanos-sidecar" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463668 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="thanos-sidecar" Mar 19 19:17:31 crc kubenswrapper[4826]: E0319 19:17:31.463684 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="prometheus" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463690 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="prometheus" Mar 19 19:17:31 crc kubenswrapper[4826]: E0319 19:17:31.463702 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="config-reloader" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463708 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="config-reloader" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463892 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="thanos-sidecar" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463907 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="config-reloader" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463919 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cddaee6-7871-4749-a9c1-43dce8c04cc0" containerName="ovn-config" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.463932 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" containerName="prometheus" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.465706 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.470930 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.471138 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.471255 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.471741 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.471862 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.472548 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.472674 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zc2nx" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.472777 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.473359 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.473614 4826 scope.go:117] "RemoveContainer" containerID="c0f1f7d466460401af90a3d9bff2ecdb385e1668e8526c5c229b079613d3dbec" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.479466 4826 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.518981 4826 scope.go:117] "RemoveContainer" containerID="7e3557f404ae5596f07e68f863e66e56de142e27881c119d8b748b24b7a4453b" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581032 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdlx\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-kube-api-access-zmdlx\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581077 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581114 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581146 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581165 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581183 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581207 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581251 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581322 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581536 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581562 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581583 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.581709 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683399 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683456 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdlx\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-kube-api-access-zmdlx\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683482 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683515 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683533 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683555 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683574 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683598 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683724 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683752 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683770 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.683790 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.684260 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.686400 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.686992 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bf194957-ec68-4ea7-b094-3e0912bc3bc5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.696871 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.698033 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.698376 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc 
kubenswrapper[4826]: I0319 19:17:31.699918 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.700192 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.700255 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bec5dc65a51c5d8459204c761fcdfad10688abc652e4da5f0ede2b4f4a0c41c7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.708222 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.708510 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.713644 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdlx\" (UniqueName: \"kubernetes.io/projected/bf194957-ec68-4ea7-b094-3e0912bc3bc5-kube-api-access-zmdlx\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.714214 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.727582 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf194957-ec68-4ea7-b094-3e0912bc3bc5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.816533 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1bf03f1-4718-4f37-9624-fffdd3002646\") pod \"prometheus-metric-storage-0\" (UID: \"bf194957-ec68-4ea7-b094-3e0912bc3bc5\") " pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.829181 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-fmgf5"] Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.830754 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.850474 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fmgf5"] Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.986916 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63839c94-d94a-4fe8-a195-b86a6a9e8b79" path="/var/lib/kubelet/pods/63839c94-d94a-4fe8-a195-b86a6a9e8b79/volumes" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.988881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts\") pod \"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:31 crc kubenswrapper[4826]: I0319 19:17:31.988994 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9hbw\" (UniqueName: \"kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw\") pod \"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.034959 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-ad6f-account-create-update-wnt5t"] Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.036311 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-ad6f-account-create-update-wnt5t" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.041016 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.045444 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ad6f-account-create-update-wnt5t"] Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.087148 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.090749 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9hbw\" (UniqueName: \"kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw\") pod \"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.090892 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts\") pod \"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.091711 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts\") pod \"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.113958 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9hbw\" (UniqueName: \"kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw\") pod 
\"heat-db-create-fmgf5\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") " pod="openstack/heat-db-create-fmgf5"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.126498 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-454rz"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.127860 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.138762 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1a54-account-create-update-f52lv"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.140030 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.145895 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.150878 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-454rz"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.162667 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a54-account-create-update-f52lv"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.192237 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbznw\" (UniqueName: \"kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.192288 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.192595 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fmgf5"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.236697 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-b84j2"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.238257 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.252863 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-b84j2"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.253033 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.253321 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sxv9p"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.253422 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.253516 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.312463 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgxx\" (UniqueName: \"kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.312848 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7d96\" (UniqueName: \"kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.312886 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.312924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.312971 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbznw\" (UniqueName: \"kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.313003 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.326746 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.340367 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dhqnj"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.342066 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.368204 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dhqnj"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.391318 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbznw\" (UniqueName: \"kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw\") pod \"heat-ad6f-account-create-update-wnt5t\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") " pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416373 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7d96\" (UniqueName: \"kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416431 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416469 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416546 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkmrf\" (UniqueName: \"kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416580 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgxx\" (UniqueName: \"kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416616 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.416672 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.417615 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.417955 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.473480 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f4b2-account-create-update-lxfwz"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.474921 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.477874 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.503324 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgxx\" (UniqueName: \"kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx\") pod \"cinder-db-create-454rz\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.505751 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f4b2-account-create-update-lxfwz"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.507808 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7d96\" (UniqueName: \"kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96\") pod \"cinder-1a54-account-create-update-f52lv\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.580683 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.581149 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.606832 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ld9b\" (UniqueName: \"kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.606982 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkmrf\" (UniqueName: \"kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.610923 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.613949 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.617647 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.625344 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-stspj"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.627061 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.635152 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e3a0-account-create-update-zqd66"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.636820 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.638832 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.639579 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkmrf\" (UniqueName: \"kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf\") pod \"keystone-db-sync-b84j2\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.645451 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-stspj"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.655527 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e3a0-account-create-update-zqd66"]
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.657236 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718174 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h4g8\" (UniqueName: \"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718291 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk9jg\" (UniqueName: \"kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718324 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718386 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxg9\" (UniqueName: \"kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718406 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718496 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.718514 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ld9b\" (UniqueName: \"kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.719919 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.742673 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ld9b\" (UniqueName: \"kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b\") pod \"barbican-db-create-dhqnj\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.747699 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-454rz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.748205 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhqnj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.765586 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a54-account-create-update-f52lv"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820339 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820407 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h4g8\" (UniqueName: \"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820440 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk9jg\" (UniqueName: \"kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820474 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820539 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxg9\" (UniqueName: \"kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.820564 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.821506 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.822384 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.823214 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.848261 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h4g8\" (UniqueName: \"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8\") pod \"neutron-db-create-stspj\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.854105 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxg9\" (UniqueName: \"kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9\") pod \"barbican-f4b2-account-create-update-lxfwz\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.859144 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9jg\" (UniqueName: \"kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg\") pod \"neutron-e3a0-account-create-update-zqd66\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.873869 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-b84j2"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.890157 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f4b2-account-create-update-lxfwz"
Mar 19 19:17:32 crc kubenswrapper[4826]: I0319 19:17:32.927782 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 19 19:17:32 crc kubenswrapper[4826]: W0319 19:17:32.950433 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf194957_ec68_4ea7_b094_3e0912bc3bc5.slice/crio-fefbc36949e70b91e777beeba24abe2d27d2a35078ff2a9551cc719e0ec3960c WatchSource:0}: Error finding container fefbc36949e70b91e777beeba24abe2d27d2a35078ff2a9551cc719e0ec3960c: Status 404 returned error can't find the container with id fefbc36949e70b91e777beeba24abe2d27d2a35078ff2a9551cc719e0ec3960c
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:32.972880 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-stspj"
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:32.986306 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3a0-account-create-update-zqd66"
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.098187 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fmgf5"]
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.211260 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ad6f-account-create-update-wnt5t"]
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.401584 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"fefbc36949e70b91e777beeba24abe2d27d2a35078ff2a9551cc719e0ec3960c"}
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.403236 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ad6f-account-create-update-wnt5t" event={"ID":"ca947aa7-664b-4392-9446-bdc5afdb3d6b","Type":"ContainerStarted","Data":"f67789917e3dab74d81159755385073eed0906ffae96b0da329ff3a2d9dff34e"}
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.405643 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fmgf5" event={"ID":"87ca634c-b18e-4567-a7ee-00d102d65496","Type":"ContainerStarted","Data":"2a58bca2e58f1b3c138903abddb3014e054a88d32aff2aa5e0d28bb0fd8feef3"}
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.825920 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a54-account-create-update-f52lv"]
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.834435 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dhqnj"]
Mar 19 19:17:33 crc kubenswrapper[4826]: I0319 19:17:33.855386 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-454rz"]
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.012713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-b84j2"]
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.046011 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f4b2-account-create-update-lxfwz"]
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.160352 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-stspj"]
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.167866 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e3a0-account-create-update-zqd66"]
Mar 19 19:17:34 crc kubenswrapper[4826]: W0319 19:17:34.328345 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd45547e_1987_40bd_ba4a_1156803be411.slice/crio-0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3 WatchSource:0}: Error finding container 0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3: Status 404 returned error can't find the container with id 0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.418138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3a0-account-create-update-zqd66" event={"ID":"fd45547e-1987-40bd-ba4a-1156803be411","Type":"ContainerStarted","Data":"0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.420023 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f4b2-account-create-update-lxfwz" event={"ID":"f8b780a2-0bd1-4947-bf59-b7c27a9c031c","Type":"ContainerStarted","Data":"2a834e9879a4628c72e83b5981a89a546d1da41cb187b070e59a77a786d29294"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.421451 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a54-account-create-update-f52lv" event={"ID":"a09e9d14-f548-48a1-bbdc-1f1588b80e3a","Type":"ContainerStarted","Data":"ea042cb37dc3a0132a77685c787f3bc6dc86f27c60955bdcace5cdc2156e58bb"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.423043 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-stspj" event={"ID":"9f6915f1-a5f3-4816-8ed6-1f0232327393","Type":"ContainerStarted","Data":"e7835c5e4f8375b09c8c90b2a730d9daeeefd36ba161d4c2fc1cb06b67675f41"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.429707 4826 generic.go:334] "Generic (PLEG): container finished" podID="ca947aa7-664b-4392-9446-bdc5afdb3d6b" containerID="bfccc4c312593561e65f147879cf5f6faaf8800563fdc4398f2a5e3f52aa9ae6" exitCode=0
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.429854 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ad6f-account-create-update-wnt5t" event={"ID":"ca947aa7-664b-4392-9446-bdc5afdb3d6b","Type":"ContainerDied","Data":"bfccc4c312593561e65f147879cf5f6faaf8800563fdc4398f2a5e3f52aa9ae6"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.435124 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-b84j2" event={"ID":"92f6edca-b463-4c0a-b97a-3d82d73a9590","Type":"ContainerStarted","Data":"a7f3e475bcb08cb83ff18d21d68953ffea7eed9d98ca14705b383f69a7e0409b"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.437970 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-454rz" event={"ID":"733b90f8-38f2-47b3-ae70-43edf8383cd8","Type":"ContainerStarted","Data":"7c14552b2f6f8ef0b3811c9e85beb8b6d423e9c2c9590364efa1154a6b30a68a"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.439738 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhqnj" event={"ID":"7d4f258d-0295-473e-89cf-b714157c3c60","Type":"ContainerStarted","Data":"fa36e1876ac06afacdd6e8bc0f6e6edfa4aa11d82d8663d79d0616178da799c6"}
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.445107 4826 generic.go:334] "Generic (PLEG): container finished" podID="87ca634c-b18e-4567-a7ee-00d102d65496" containerID="111696d1713a2f5f93fafa656988f505288d1c57dddfcbf6ba1445041e2d6b37" exitCode=0
Mar 19 19:17:34 crc kubenswrapper[4826]: I0319 19:17:34.445200 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fmgf5" event={"ID":"87ca634c-b18e-4567-a7ee-00d102d65496","Type":"ContainerDied","Data":"111696d1713a2f5f93fafa656988f505288d1c57dddfcbf6ba1445041e2d6b37"}
Mar 19 19:17:35 crc kubenswrapper[4826]: I0319 19:17:35.455407 4826 generic.go:334] "Generic (PLEG): container finished" podID="9f6915f1-a5f3-4816-8ed6-1f0232327393" containerID="434b2e2245dc6797fb623b9cf3f35e328c48da5060bc5847bb26e5706e09ff1b" exitCode=0
Mar 19 19:17:35 crc kubenswrapper[4826]: I0319 19:17:35.455608 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-stspj" event={"ID":"9f6915f1-a5f3-4816-8ed6-1f0232327393","Type":"ContainerDied","Data":"434b2e2245dc6797fb623b9cf3f35e328c48da5060bc5847bb26e5706e09ff1b"}
Mar 19 19:17:35 crc kubenswrapper[4826]: I0319 19:17:35.459122 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"366ffc9b6819ae42a0ec2096618dcf62fc59bb150b0f650d962aa1805ac4c516"}
Mar 19 19:17:35 crc kubenswrapper[4826]: I0319 19:17:35.460816 4826 generic.go:334] "Generic (PLEG): container finished" podID="7d4f258d-0295-473e-89cf-b714157c3c60" containerID="cca5679346998ae47aa98543875bc5d8eae4408ff4b5a140ec9b4cdd78d0d45d" exitCode=0
Mar 19 19:17:35 crc kubenswrapper[4826]: I0319 19:17:35.460887 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhqnj" event={"ID":"7d4f258d-0295-473e-89cf-b714157c3c60","Type":"ContainerDied","Data":"cca5679346998ae47aa98543875bc5d8eae4408ff4b5a140ec9b4cdd78d0d45d"}
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.474874 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ad6f-account-create-update-wnt5t" event={"ID":"ca947aa7-664b-4392-9446-bdc5afdb3d6b","Type":"ContainerDied","Data":"f67789917e3dab74d81159755385073eed0906ffae96b0da329ff3a2d9dff34e"}
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.475121 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f67789917e3dab74d81159755385073eed0906ffae96b0da329ff3a2d9dff34e"
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.477925 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fmgf5" event={"ID":"87ca634c-b18e-4567-a7ee-00d102d65496","Type":"ContainerDied","Data":"2a58bca2e58f1b3c138903abddb3014e054a88d32aff2aa5e0d28bb0fd8feef3"}
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.478182 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a58bca2e58f1b3c138903abddb3014e054a88d32aff2aa5e0d28bb0fd8feef3"
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.538804 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fmgf5"
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.540682 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ad6f-account-create-update-wnt5t"
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608028 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts\") pod \"87ca634c-b18e-4567-a7ee-00d102d65496\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") "
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608104 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9hbw\" (UniqueName: \"kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw\") pod \"87ca634c-b18e-4567-a7ee-00d102d65496\" (UID: \"87ca634c-b18e-4567-a7ee-00d102d65496\") "
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608152 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbznw\" (UniqueName: \"kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw\") pod \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") "
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608177 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts\") pod \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\" (UID: \"ca947aa7-664b-4392-9446-bdc5afdb3d6b\") "
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608835 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca947aa7-664b-4392-9446-bdc5afdb3d6b" (UID: "ca947aa7-664b-4392-9446-bdc5afdb3d6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.608910 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87ca634c-b18e-4567-a7ee-00d102d65496" (UID: "87ca634c-b18e-4567-a7ee-00d102d65496"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.614315 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw" (OuterVolumeSpecName: "kube-api-access-kbznw") pod "ca947aa7-664b-4392-9446-bdc5afdb3d6b" (UID: "ca947aa7-664b-4392-9446-bdc5afdb3d6b"). InnerVolumeSpecName "kube-api-access-kbznw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.614908 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw" (OuterVolumeSpecName: "kube-api-access-l9hbw") pod "87ca634c-b18e-4567-a7ee-00d102d65496" (UID: "87ca634c-b18e-4567-a7ee-00d102d65496"). InnerVolumeSpecName "kube-api-access-l9hbw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.710699 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87ca634c-b18e-4567-a7ee-00d102d65496-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.710738 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9hbw\" (UniqueName: \"kubernetes.io/projected/87ca634c-b18e-4567-a7ee-00d102d65496-kube-api-access-l9hbw\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.710753 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbznw\" (UniqueName: \"kubernetes.io/projected/ca947aa7-664b-4392-9446-bdc5afdb3d6b-kube-api-access-kbznw\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.710765 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca947aa7-664b-4392-9446-bdc5afdb3d6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.846342 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhqnj" Mar 19 19:17:36 crc kubenswrapper[4826]: I0319 19:17:36.925051 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-stspj" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.016345 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ld9b\" (UniqueName: \"kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b\") pod \"7d4f258d-0295-473e-89cf-b714157c3c60\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.017278 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts\") pod \"7d4f258d-0295-473e-89cf-b714157c3c60\" (UID: \"7d4f258d-0295-473e-89cf-b714157c3c60\") " Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.017810 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d4f258d-0295-473e-89cf-b714157c3c60" (UID: "7d4f258d-0295-473e-89cf-b714157c3c60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.018821 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d4f258d-0295-473e-89cf-b714157c3c60-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.022489 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b" (OuterVolumeSpecName: "kube-api-access-5ld9b") pod "7d4f258d-0295-473e-89cf-b714157c3c60" (UID: "7d4f258d-0295-473e-89cf-b714157c3c60"). InnerVolumeSpecName "kube-api-access-5ld9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.119804 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts\") pod \"9f6915f1-a5f3-4816-8ed6-1f0232327393\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.120405 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f6915f1-a5f3-4816-8ed6-1f0232327393" (UID: "9f6915f1-a5f3-4816-8ed6-1f0232327393"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.120799 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h4g8\" (UniqueName: \"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8\") pod \"9f6915f1-a5f3-4816-8ed6-1f0232327393\" (UID: \"9f6915f1-a5f3-4816-8ed6-1f0232327393\") " Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.121389 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6915f1-a5f3-4816-8ed6-1f0232327393-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.121412 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ld9b\" (UniqueName: \"kubernetes.io/projected/7d4f258d-0295-473e-89cf-b714157c3c60-kube-api-access-5ld9b\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.123696 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8" (OuterVolumeSpecName: "kube-api-access-8h4g8") pod "9f6915f1-a5f3-4816-8ed6-1f0232327393" (UID: "9f6915f1-a5f3-4816-8ed6-1f0232327393"). InnerVolumeSpecName "kube-api-access-8h4g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.223233 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h4g8\" (UniqueName: \"kubernetes.io/projected/9f6915f1-a5f3-4816-8ed6-1f0232327393-kube-api-access-8h4g8\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.496079 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-stspj" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.496075 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-stspj" event={"ID":"9f6915f1-a5f3-4816-8ed6-1f0232327393","Type":"ContainerDied","Data":"e7835c5e4f8375b09c8c90b2a730d9daeeefd36ba161d4c2fc1cb06b67675f41"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.496720 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7835c5e4f8375b09c8c90b2a730d9daeeefd36ba161d4c2fc1cb06b67675f41" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.501858 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"5272ff3d9ff144701bc238f70231a46ebe4a35c338002cb5c6635bb4d4745a2f"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.501899 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"debcd92c2bd2da226aaf3bc48d05f4d9a1284837a5c2126fd2cfbc2012200eb6"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.504649 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"4b15d3885216cdb7f5e62dc52dd5fa57568e5e9acdee2c4485f7a1b28e2c3e6d"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.509647 4826 generic.go:334] "Generic (PLEG): container finished" podID="fd45547e-1987-40bd-ba4a-1156803be411" containerID="75212f93ec3129bc16edebc174b0ad39f7df96e9f81928c5a6b537553dd3ecd8" exitCode=0 Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.509739 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3a0-account-create-update-zqd66" event={"ID":"fd45547e-1987-40bd-ba4a-1156803be411","Type":"ContainerDied","Data":"75212f93ec3129bc16edebc174b0ad39f7df96e9f81928c5a6b537553dd3ecd8"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.511622 4826 generic.go:334] "Generic (PLEG): container finished" podID="f8b780a2-0bd1-4947-bf59-b7c27a9c031c" containerID="21ef0dc3ebf3f9ddb2435ffd8e3564cabbac3213decd62c7806f1503260ee5e7" exitCode=0 Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.511688 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f4b2-account-create-update-lxfwz" event={"ID":"f8b780a2-0bd1-4947-bf59-b7c27a9c031c","Type":"ContainerDied","Data":"21ef0dc3ebf3f9ddb2435ffd8e3564cabbac3213decd62c7806f1503260ee5e7"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.513068 4826 generic.go:334] "Generic (PLEG): container finished" podID="733b90f8-38f2-47b3-ae70-43edf8383cd8" containerID="08a6f4fbd3e914217a0081f36c98447e599c5ecaaabebe10ccfe789eb86e807f" exitCode=0 Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.513114 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-454rz" event={"ID":"733b90f8-38f2-47b3-ae70-43edf8383cd8","Type":"ContainerDied","Data":"08a6f4fbd3e914217a0081f36c98447e599c5ecaaabebe10ccfe789eb86e807f"} Mar 19 19:17:37 crc 
kubenswrapper[4826]: I0319 19:17:37.514687 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhqnj" event={"ID":"7d4f258d-0295-473e-89cf-b714157c3c60","Type":"ContainerDied","Data":"fa36e1876ac06afacdd6e8bc0f6e6edfa4aa11d82d8663d79d0616178da799c6"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.514791 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa36e1876ac06afacdd6e8bc0f6e6edfa4aa11d82d8663d79d0616178da799c6" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.514907 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhqnj" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.522406 4826 generic.go:334] "Generic (PLEG): container finished" podID="a09e9d14-f548-48a1-bbdc-1f1588b80e3a" containerID="c0addfd70dcb79bb9243581f38de8d3e2472393d69e690dc2f92816fc65d8ece" exitCode=0 Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.522531 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ad6f-account-create-update-wnt5t" Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.526909 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a54-account-create-update-f52lv" event={"ID":"a09e9d14-f548-48a1-bbdc-1f1588b80e3a","Type":"ContainerDied","Data":"c0addfd70dcb79bb9243581f38de8d3e2472393d69e690dc2f92816fc65d8ece"} Mar 19 19:17:37 crc kubenswrapper[4826]: I0319 19:17:37.526988 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-fmgf5" Mar 19 19:17:38 crc kubenswrapper[4826]: I0319 19:17:38.538886 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"83d4076f2729b73441fc03a53255e9160ad4e8e88cb4c28edbdd7c3d5ec8d1b9"} Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.588122 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3a0-account-create-update-zqd66" event={"ID":"fd45547e-1987-40bd-ba4a-1156803be411","Type":"ContainerDied","Data":"0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3"} Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.588808 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2bb32eec1adae7192d62be01729158e172ac8833b945f15594e80f1a12d9b3" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.592932 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-454rz" event={"ID":"733b90f8-38f2-47b3-ae70-43edf8383cd8","Type":"ContainerDied","Data":"7c14552b2f6f8ef0b3811c9e85beb8b6d423e9c2c9590364efa1154a6b30a68a"} Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.593007 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c14552b2f6f8ef0b3811c9e85beb8b6d423e9c2c9590364efa1154a6b30a68a" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.597248 4826 generic.go:334] "Generic (PLEG): container finished" podID="9d888eb3-4bcd-470d-95c6-aa3d281c6332" containerID="008f296df9d929c34c10dbca826659686020eccb5e94ffc11af8acc74b3caddd" exitCode=0 Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.597895 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvwbt" event={"ID":"9d888eb3-4bcd-470d-95c6-aa3d281c6332","Type":"ContainerDied","Data":"008f296df9d929c34c10dbca826659686020eccb5e94ffc11af8acc74b3caddd"} Mar 19 
19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.649727 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-454rz" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.655689 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3a0-account-create-update-zqd66" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.758974 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xgxx\" (UniqueName: \"kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx\") pod \"733b90f8-38f2-47b3-ae70-43edf8383cd8\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.759562 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk9jg\" (UniqueName: \"kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg\") pod \"fd45547e-1987-40bd-ba4a-1156803be411\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.759709 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts\") pod \"733b90f8-38f2-47b3-ae70-43edf8383cd8\" (UID: \"733b90f8-38f2-47b3-ae70-43edf8383cd8\") " Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.759764 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts\") pod \"fd45547e-1987-40bd-ba4a-1156803be411\" (UID: \"fd45547e-1987-40bd-ba4a-1156803be411\") " Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.760190 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd45547e-1987-40bd-ba4a-1156803be411" (UID: "fd45547e-1987-40bd-ba4a-1156803be411"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.760218 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "733b90f8-38f2-47b3-ae70-43edf8383cd8" (UID: "733b90f8-38f2-47b3-ae70-43edf8383cd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.760667 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733b90f8-38f2-47b3-ae70-43edf8383cd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.760680 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd45547e-1987-40bd-ba4a-1156803be411-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.766549 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg" (OuterVolumeSpecName: "kube-api-access-fk9jg") pod "fd45547e-1987-40bd-ba4a-1156803be411" (UID: "fd45547e-1987-40bd-ba4a-1156803be411"). InnerVolumeSpecName "kube-api-access-fk9jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.766833 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx" (OuterVolumeSpecName: "kube-api-access-9xgxx") pod "733b90f8-38f2-47b3-ae70-43edf8383cd8" (UID: "733b90f8-38f2-47b3-ae70-43edf8383cd8"). InnerVolumeSpecName "kube-api-access-9xgxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.862400 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk9jg\" (UniqueName: \"kubernetes.io/projected/fd45547e-1987-40bd-ba4a-1156803be411-kube-api-access-fk9jg\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.862433 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xgxx\" (UniqueName: \"kubernetes.io/projected/733b90f8-38f2-47b3-ae70-43edf8383cd8-kube-api-access-9xgxx\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:41 crc kubenswrapper[4826]: I0319 19:17:41.986682 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a54-account-create-update-f52lv" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.001335 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f4b2-account-create-update-lxfwz" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.170281 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7d96\" (UniqueName: \"kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96\") pod \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.170377 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts\") pod \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.170426 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts\") pod \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\" (UID: \"a09e9d14-f548-48a1-bbdc-1f1588b80e3a\") " Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.170556 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxg9\" (UniqueName: \"kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9\") pod \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\" (UID: \"f8b780a2-0bd1-4947-bf59-b7c27a9c031c\") " Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.172503 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8b780a2-0bd1-4947-bf59-b7c27a9c031c" (UID: "f8b780a2-0bd1-4947-bf59-b7c27a9c031c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.173524 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a09e9d14-f548-48a1-bbdc-1f1588b80e3a" (UID: "a09e9d14-f548-48a1-bbdc-1f1588b80e3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.175903 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96" (OuterVolumeSpecName: "kube-api-access-m7d96") pod "a09e9d14-f548-48a1-bbdc-1f1588b80e3a" (UID: "a09e9d14-f548-48a1-bbdc-1f1588b80e3a"). InnerVolumeSpecName "kube-api-access-m7d96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.176140 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.176170 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.183963 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9" (OuterVolumeSpecName: "kube-api-access-nxxg9") pod "f8b780a2-0bd1-4947-bf59-b7c27a9c031c" (UID: "f8b780a2-0bd1-4947-bf59-b7c27a9c031c"). InnerVolumeSpecName "kube-api-access-nxxg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.278515 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxg9\" (UniqueName: \"kubernetes.io/projected/f8b780a2-0bd1-4947-bf59-b7c27a9c031c-kube-api-access-nxxg9\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.278838 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7d96\" (UniqueName: \"kubernetes.io/projected/a09e9d14-f548-48a1-bbdc-1f1588b80e3a-kube-api-access-m7d96\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.620853 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f4b2-account-create-update-lxfwz" event={"ID":"f8b780a2-0bd1-4947-bf59-b7c27a9c031c","Type":"ContainerDied","Data":"2a834e9879a4628c72e83b5981a89a546d1da41cb187b070e59a77a786d29294"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.620896 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a834e9879a4628c72e83b5981a89a546d1da41cb187b070e59a77a786d29294" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.620954 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f4b2-account-create-update-lxfwz" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.634568 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a54-account-create-update-f52lv" event={"ID":"a09e9d14-f548-48a1-bbdc-1f1588b80e3a","Type":"ContainerDied","Data":"ea042cb37dc3a0132a77685c787f3bc6dc86f27c60955bdcace5cdc2156e58bb"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.634613 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea042cb37dc3a0132a77685c787f3bc6dc86f27c60955bdcace5cdc2156e58bb" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.634698 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a54-account-create-update-f52lv" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.682919 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"398f3645b4a9e8dba02188ece89d06b65a263472ace0d4600355372173650d5b"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.682965 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"166bff2621c4115d43518745b118320544be84377e3c175cb458c80d37e7d90d"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.682977 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"33ffb600e6db6bc3b2efc42625c9cf1d5f7435cc39b46ffbd1ee8c028ec6fed9"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.700012 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerDied","Data":"4b15d3885216cdb7f5e62dc52dd5fa57568e5e9acdee2c4485f7a1b28e2c3e6d"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.699973 4826 generic.go:334] "Generic (PLEG): container finished" podID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerID="4b15d3885216cdb7f5e62dc52dd5fa57568e5e9acdee2c4485f7a1b28e2c3e6d" exitCode=0 Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.718292 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3a0-account-create-update-zqd66" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.719036 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-b84j2" event={"ID":"92f6edca-b463-4c0a-b97a-3d82d73a9590","Type":"ContainerStarted","Data":"11a96161a72d30617a9c3938a06d39b3ea5b8072c9985ba3d6cf909483a17a5a"} Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.719107 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-454rz" Mar 19 19:17:42 crc kubenswrapper[4826]: I0319 19:17:42.748619 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-b84j2" podStartSLOduration=3.1053681969999998 podStartE2EDuration="10.748601999s" podCreationTimestamp="2026-03-19 19:17:32 +0000 UTC" firstStartedPulling="2026-03-19 19:17:34.318865248 +0000 UTC m=+1279.072933561" lastFinishedPulling="2026-03-19 19:17:41.96209905 +0000 UTC m=+1286.716167363" observedRunningTime="2026-03-19 19:17:42.740315089 +0000 UTC m=+1287.494383402" watchObservedRunningTime="2026-03-19 19:17:42.748601999 +0000 UTC m=+1287.502670312" Mar 19 19:17:42 crc kubenswrapper[4826]: E0319 19:17:42.970339 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda09e9d14_f548_48a1_bbdc_1f1588b80e3a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b780a2_0bd1_4947_bf59_b7c27a9c031c.slice/crio-2a834e9879a4628c72e83b5981a89a546d1da41cb187b070e59a77a786d29294\": RecentStats: unable to find data in memory cache]" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.210649 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.403715 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l29wd\" (UniqueName: \"kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd\") pod \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.403931 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data\") pod \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.403983 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle\") pod \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.404185 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data\") pod \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\" (UID: \"9d888eb3-4bcd-470d-95c6-aa3d281c6332\") " Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.408800 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd" (OuterVolumeSpecName: "kube-api-access-l29wd") pod "9d888eb3-4bcd-470d-95c6-aa3d281c6332" (UID: "9d888eb3-4bcd-470d-95c6-aa3d281c6332"). InnerVolumeSpecName "kube-api-access-l29wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.409433 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9d888eb3-4bcd-470d-95c6-aa3d281c6332" (UID: "9d888eb3-4bcd-470d-95c6-aa3d281c6332"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.439751 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d888eb3-4bcd-470d-95c6-aa3d281c6332" (UID: "9d888eb3-4bcd-470d-95c6-aa3d281c6332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.489963 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data" (OuterVolumeSpecName: "config-data") pod "9d888eb3-4bcd-470d-95c6-aa3d281c6332" (UID: "9d888eb3-4bcd-470d-95c6-aa3d281c6332"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.506414 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.506457 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.506470 4826 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9d888eb3-4bcd-470d-95c6-aa3d281c6332-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.506484 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l29wd\" (UniqueName: \"kubernetes.io/projected/9d888eb3-4bcd-470d-95c6-aa3d281c6332-kube-api-access-l29wd\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.739226 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvwbt" event={"ID":"9d888eb3-4bcd-470d-95c6-aa3d281c6332","Type":"ContainerDied","Data":"eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba"} Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.739271 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed7e652e07b53bd671193eb8051d93d7527d2dc40ad9ad5ca561c0851fe9fba" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.739239 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvwbt" Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.747720 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"a0a6be61771c3bd1837bd426d4b58d5149185e19bd788e28443b62cb3821baf3"} Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.759861 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"77fe9b55ff9341cad6789d6b846a8bb6bf24df5b1604294942431066be590eff"} Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.759942 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"e86838f6d700e55828c26a4fdadac354155a82c624e0f4686b6bbcaa9b611b97"} Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.759954 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"53de031cdf3e9981415bc79932b9910af44084b440f20bfbb339fb8a86b4aa4e"} Mar 19 19:17:43 crc kubenswrapper[4826]: I0319 19:17:43.759964 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"775f9d8a-377a-4913-b2d2-3bb1b7aec077","Type":"ContainerStarted","Data":"615264bb0537fa45381f56efbe5d08ad442f6d283809b691630901a001d37371"} Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.039247 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.154444494 podStartE2EDuration="58.039220913s" podCreationTimestamp="2026-03-19 19:16:46 +0000 UTC" firstStartedPulling="2026-03-19 19:17:20.075500667 +0000 UTC m=+1264.829568980" lastFinishedPulling="2026-03-19 
19:17:41.960277056 +0000 UTC m=+1286.714345399" observedRunningTime="2026-03-19 19:17:43.823618562 +0000 UTC m=+1288.577686895" watchObservedRunningTime="2026-03-19 19:17:44.039220913 +0000 UTC m=+1288.793289226" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.051539 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-5mnmh"] Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052015 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733b90f8-38f2-47b3-ae70-43edf8383cd8" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052057 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="733b90f8-38f2-47b3-ae70-43edf8383cd8" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052083 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09e9d14-f548-48a1-bbdc-1f1588b80e3a" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052091 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09e9d14-f548-48a1-bbdc-1f1588b80e3a" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052100 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ca634c-b18e-4567-a7ee-00d102d65496" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052106 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ca634c-b18e-4567-a7ee-00d102d65496" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052124 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4f258d-0295-473e-89cf-b714157c3c60" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052130 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4f258d-0295-473e-89cf-b714157c3c60" 
containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052148 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d888eb3-4bcd-470d-95c6-aa3d281c6332" containerName="glance-db-sync" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052154 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d888eb3-4bcd-470d-95c6-aa3d281c6332" containerName="glance-db-sync" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052166 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6915f1-a5f3-4816-8ed6-1f0232327393" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052174 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6915f1-a5f3-4816-8ed6-1f0232327393" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052186 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b780a2-0bd1-4947-bf59-b7c27a9c031c" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052192 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b780a2-0bd1-4947-bf59-b7c27a9c031c" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052203 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd45547e-1987-40bd-ba4a-1156803be411" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052210 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd45547e-1987-40bd-ba4a-1156803be411" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.052219 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca947aa7-664b-4392-9446-bdc5afdb3d6b" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052225 4826 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ca947aa7-664b-4392-9446-bdc5afdb3d6b" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052430 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="733b90f8-38f2-47b3-ae70-43edf8383cd8" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052445 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b780a2-0bd1-4947-bf59-b7c27a9c031c" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052452 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca947aa7-664b-4392-9446-bdc5afdb3d6b" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052460 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09e9d14-f548-48a1-bbdc-1f1588b80e3a" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052467 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ca634c-b18e-4567-a7ee-00d102d65496" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052478 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd45547e-1987-40bd-ba4a-1156803be411" containerName="mariadb-account-create-update" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052492 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4f258d-0295-473e-89cf-b714157c3c60" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052499 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d888eb3-4bcd-470d-95c6-aa3d281c6332" containerName="glance-db-sync" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.052508 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f6915f1-a5f3-4816-8ed6-1f0232327393" containerName="mariadb-database-create" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.053582 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.065140 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-5mnmh"] Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.191233 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-5mnmh"] Mar 19 19:17:44 crc kubenswrapper[4826]: E0319 19:17:44.196481 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-ljf5s ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" podUID="cb4c26d3-e427-4866-9206-f5dd57797e17" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.217508 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.219913 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.221892 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.227093 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.227134 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.227190 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljf5s\" (UniqueName: \"kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.227235 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.227446 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.245762 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329235 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329309 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329360 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329380 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjc97\" (UniqueName: \"kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329401 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329424 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329555 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljf5s\" (UniqueName: \"kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329825 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc 
kubenswrapper[4826]: I0319 19:17:44.329859 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.329924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.330386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.330410 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.330396 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.330555 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.355414 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljf5s\" (UniqueName: \"kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s\") pod \"dnsmasq-dns-74dc88fc-5mnmh\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.431623 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.431893 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.431984 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.432097 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.432204 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.432283 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjc97\" (UniqueName: \"kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.432428 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.432794 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.433012 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb\") 
pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.433386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.433519 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.450290 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjc97\" (UniqueName: \"kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97\") pod \"dnsmasq-dns-5f59b8f679-zdqw2\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.548715 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.768572 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.782680 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.960675 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb\") pod \"cb4c26d3-e427-4866-9206-f5dd57797e17\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.960787 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljf5s\" (UniqueName: \"kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s\") pod \"cb4c26d3-e427-4866-9206-f5dd57797e17\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961061 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc\") pod \"cb4c26d3-e427-4866-9206-f5dd57797e17\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961127 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config\") pod \"cb4c26d3-e427-4866-9206-f5dd57797e17\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961158 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb\") pod \"cb4c26d3-e427-4866-9206-f5dd57797e17\" (UID: \"cb4c26d3-e427-4866-9206-f5dd57797e17\") " Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961267 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb4c26d3-e427-4866-9206-f5dd57797e17" (UID: "cb4c26d3-e427-4866-9206-f5dd57797e17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961514 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config" (OuterVolumeSpecName: "config") pod "cb4c26d3-e427-4866-9206-f5dd57797e17" (UID: "cb4c26d3-e427-4866-9206-f5dd57797e17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961554 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb4c26d3-e427-4866-9206-f5dd57797e17" (UID: "cb4c26d3-e427-4866-9206-f5dd57797e17"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961816 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961836 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.961846 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.962215 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb4c26d3-e427-4866-9206-f5dd57797e17" (UID: "cb4c26d3-e427-4866-9206-f5dd57797e17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:44 crc kubenswrapper[4826]: I0319 19:17:44.966050 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s" (OuterVolumeSpecName: "kube-api-access-ljf5s") pod "cb4c26d3-e427-4866-9206-f5dd57797e17" (UID: "cb4c26d3-e427-4866-9206-f5dd57797e17"). InnerVolumeSpecName "kube-api-access-ljf5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.015430 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.063443 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4c26d3-e427-4866-9206-f5dd57797e17-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.063690 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljf5s\" (UniqueName: \"kubernetes.io/projected/cb4c26d3-e427-4866-9206-f5dd57797e17-kube-api-access-ljf5s\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.779755 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" event={"ID":"e1769715-e77c-4d52-944b-d380adc06ed3","Type":"ContainerStarted","Data":"c37ef2fc40147680023569860b9b60d3a99dc257ad45d37e781cbf0a556ae3df"} Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.782068 4826 generic.go:334] "Generic (PLEG): container finished" podID="92f6edca-b463-4c0a-b97a-3d82d73a9590" containerID="11a96161a72d30617a9c3938a06d39b3ea5b8072c9985ba3d6cf909483a17a5a" exitCode=0 Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.782111 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-b84j2" event={"ID":"92f6edca-b463-4c0a-b97a-3d82d73a9590","Type":"ContainerDied","Data":"11a96161a72d30617a9c3938a06d39b3ea5b8072c9985ba3d6cf909483a17a5a"} Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.782149 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-5mnmh" Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.846293 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-5mnmh"] Mar 19 19:17:45 crc kubenswrapper[4826]: I0319 19:17:45.854966 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-5mnmh"] Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.008997 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4c26d3-e427-4866-9206-f5dd57797e17" path="/var/lib/kubelet/pods/cb4c26d3-e427-4866-9206-f5dd57797e17/volumes" Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.795806 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"82f46cf55b4c3da9a59beccd27ec0be4ae50b9b57e4d9ceac4c02e04b3269d96"} Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.796192 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"a61960746b0bf6bb3247f24d9ad08813c4f7edd9337023c3edd4969ca7469d30"} Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.797992 4826 generic.go:334] "Generic (PLEG): container finished" podID="e1769715-e77c-4d52-944b-d380adc06ed3" containerID="d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f" exitCode=0 Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.798082 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" event={"ID":"e1769715-e77c-4d52-944b-d380adc06ed3","Type":"ContainerDied","Data":"d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f"} Mar 19 19:17:46 crc kubenswrapper[4826]: I0319 19:17:46.833483 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.833463858 podStartE2EDuration="15.833463858s" podCreationTimestamp="2026-03-19 19:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:46.825729611 +0000 UTC m=+1291.579797924" watchObservedRunningTime="2026-03-19 19:17:46.833463858 +0000 UTC m=+1291.587532171" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.090184 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.090464 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.101602 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.211602 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-b84j2" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.312478 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkmrf\" (UniqueName: \"kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf\") pod \"92f6edca-b463-4c0a-b97a-3d82d73a9590\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.312555 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data\") pod \"92f6edca-b463-4c0a-b97a-3d82d73a9590\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.312586 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle\") pod \"92f6edca-b463-4c0a-b97a-3d82d73a9590\" (UID: \"92f6edca-b463-4c0a-b97a-3d82d73a9590\") " Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.317776 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf" (OuterVolumeSpecName: "kube-api-access-wkmrf") pod "92f6edca-b463-4c0a-b97a-3d82d73a9590" (UID: "92f6edca-b463-4c0a-b97a-3d82d73a9590"). InnerVolumeSpecName "kube-api-access-wkmrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.344236 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92f6edca-b463-4c0a-b97a-3d82d73a9590" (UID: "92f6edca-b463-4c0a-b97a-3d82d73a9590"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.361833 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data" (OuterVolumeSpecName: "config-data") pod "92f6edca-b463-4c0a-b97a-3d82d73a9590" (UID: "92f6edca-b463-4c0a-b97a-3d82d73a9590"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.415220 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkmrf\" (UniqueName: \"kubernetes.io/projected/92f6edca-b463-4c0a-b97a-3d82d73a9590-kube-api-access-wkmrf\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.415434 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.415518 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f6edca-b463-4c0a-b97a-3d82d73a9590-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.808112 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-b84j2" event={"ID":"92f6edca-b463-4c0a-b97a-3d82d73a9590","Type":"ContainerDied","Data":"a7f3e475bcb08cb83ff18d21d68953ffea7eed9d98ca14705b383f69a7e0409b"} Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.808161 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f3e475bcb08cb83ff18d21d68953ffea7eed9d98ca14705b383f69a7e0409b" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.808232 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-b84j2" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.818732 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" event={"ID":"e1769715-e77c-4d52-944b-d380adc06ed3","Type":"ContainerStarted","Data":"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574"} Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.819103 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.824749 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 19:17:47 crc kubenswrapper[4826]: I0319 19:17:47.859137 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" podStartSLOduration=3.859117777 podStartE2EDuration="3.859117777s" podCreationTimestamp="2026-03-19 19:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:47.850351055 +0000 UTC m=+1292.604419388" watchObservedRunningTime="2026-03-19 19:17:47.859117777 +0000 UTC m=+1292.613186100" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.019326 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.064708 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:48 crc kubenswrapper[4826]: E0319 19:17:48.065500 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f6edca-b463-4c0a-b97a-3d82d73a9590" containerName="keystone-db-sync" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.065514 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f6edca-b463-4c0a-b97a-3d82d73a9590" 
containerName="keystone-db-sync" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.065823 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f6edca-b463-4c0a-b97a-3d82d73a9590" containerName="keystone-db-sync" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.066976 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.097146 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6dng4"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.098518 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.117399 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.117648 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.117784 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.117868 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.117921 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sxv9p" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.167728 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.184607 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6dng4"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.233913 4826 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/heat-db-sync-xfct2"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.245939 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhxb\" (UniqueName: \"kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.245992 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246021 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246073 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246107 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: 
\"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246122 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246143 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246220 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246237 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246262 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: 
\"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246308 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.246336 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztns\" (UniqueName: \"kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.250150 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xfct2"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.250263 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.255685 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhwpp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.255768 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.321506 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-p7jzd"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.322894 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.330566 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.330779 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.331227 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-266gk" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.347894 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhxb\" (UniqueName: \"kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.347945 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.347969 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348011 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts\") pod \"keystone-bootstrap-6dng4\" 
(UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348037 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348050 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348071 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348140 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348158 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc 
kubenswrapper[4826]: I0319 19:17:48.348187 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348225 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.348253 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cztns\" (UniqueName: \"kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.349789 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.350694 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.359513 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.360669 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.361493 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.363461 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.370609 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.378452 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.380267 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhxb\" (UniqueName: \"kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb\") pod \"dnsmasq-dns-bbf5cc879-97nhn\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.384271 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p7jzd"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.385278 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.385699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.391641 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cztns\" (UniqueName: \"kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns\") pod \"keystone-bootstrap-6dng4\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") " pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.449929 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-f4t44\" (UniqueName: \"kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.449992 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8kf4\" (UniqueName: \"kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450014 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450062 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450097 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450134 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450162 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450182 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450208 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.450835 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.489125 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6dng4" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.509534 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xkjdp"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.519340 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.545080 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jgxmp"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.546552 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.565397 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.565611 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bsd2c" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.565781 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.565965 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9hfvr" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.566196 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570535 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 
19:17:48.570616 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570683 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570712 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrl4l\" (UniqueName: \"kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570750 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570790 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570806 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570827 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570856 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570911 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570938 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mxb\" 
(UniqueName: \"kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.570987 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4t44\" (UniqueName: \"kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.571017 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8kf4\" (UniqueName: \"kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.571035 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.581616 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.620525 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data\") pod \"heat-db-sync-xfct2\" (UID: 
\"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.622036 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.623705 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.623883 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.624386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.688986 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.692241 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j8kf4\" (UniqueName: \"kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4\") pod \"heat-db-sync-xfct2\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.707386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4t44\" (UniqueName: \"kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44\") pod \"cinder-db-sync-p7jzd\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.720231 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xkjdp"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.758322 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.758392 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrl4l\" (UniqueName: \"kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.758420 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 
19:17:48.758481 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.758620 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.758694 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mxb\" (UniqueName: \"kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.764727 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jgxmp"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.772912 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.773929 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 
19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.775822 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.785761 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.793284 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mxb\" (UniqueName: \"kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb\") pod \"neutron-db-sync-jgxmp\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") " pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.809308 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrl4l\" (UniqueName: \"kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.809735 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.819262 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data\") pod \"barbican-db-sync-xkjdp\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.828717 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.831195 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.861643 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.862295 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.862446 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.862641 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjgj\" (UniqueName: \"kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.868145 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.868363 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.866751 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xfct2" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.875943 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.910295 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hhk58"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.912649 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.917465 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.918951 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.919246 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-66wpf" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.961575 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hhk58"] Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.970809 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.970876 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.970963 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.970998 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971046 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjgj\" (UniqueName: \"kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971069 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971088 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971126 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971149 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971172 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.971187 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75899\" (UniqueName: \"kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.972416 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.972874 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.974243 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.974949 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:48 crc kubenswrapper[4826]: I0319 19:17:48.976835 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.001460 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjgj\" (UniqueName: \"kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj\") pod \"dnsmasq-dns-56df8fb6b7-gvpkd\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.030687 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.033598 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.036459 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.038691 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075189 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075465 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075523 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075641 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075710 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075746 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075770 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075792 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075807 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75899\" (UniqueName: \"kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075826 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wckx\" (UniqueName: 
\"kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075849 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.075867 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.077375 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.081965 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.085057 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " 
pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.087882 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.088162 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.089406 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.112273 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.113207 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75899\" (UniqueName: \"kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899\") pod \"placement-db-sync-hhk58\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.124753 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.150003 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hhk58" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178097 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178201 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178274 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wckx\" (UniqueName: \"kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178322 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178408 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178439 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.178514 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.180692 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.181472 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.189050 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.191933 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.192250 
4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.194307 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.201633 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wckx\" (UniqueName: \"kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx\") pod \"ceilometer-0\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.226345 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.235555 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.242988 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.256333 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.260497 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.260690 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.260870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jrns4" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.264077 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.387238 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.389136 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.394046 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.395825 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402271 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402343 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402371 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402395 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402455 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402492 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402521 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.402539 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d2nn\" (UniqueName: \"kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.404240 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.474349 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505117 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505335 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505360 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzbb\" (UniqueName: \"kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505383 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505417 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " 
pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505434 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d2nn\" (UniqueName: \"kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505461 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505510 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505550 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505583 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod 
\"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505615 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505639 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505674 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505702 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505717 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.505737 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.506169 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.506365 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.522772 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.523550 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " 
pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.528406 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.533392 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.545926 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d2nn\" (UniqueName: \"kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.572389 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6dng4"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609397 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzbb\" (UniqueName: \"kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609464 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609510 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609574 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609628 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609729 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609760 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.609805 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.610489 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.612647 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.619600 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.622720 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.624014 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.624029 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.630881 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.630924 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5ddb69bbd0946a253964aa6f0f321c34484aa81c710d424a0f6be0ed74bf7c0/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.641552 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzbb\" (UniqueName: \"kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.650015 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.650057 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a5c6a8766a786cbd3eca35b0f5a1e3802ae1e4cbe235d9479277134f5caec0c/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.770300 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.848274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.848857 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xfct2"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.849882 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p7jzd"] Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.891169 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xfct2" 
event={"ID":"d92c353f-6fae-4be8-8580-4066bb56e856","Type":"ContainerStarted","Data":"9c6ef73f012d6c9d864f42cd7b5eb0f8b613370217ae491511a18979a4c7db1a"} Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.893055 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p7jzd" event={"ID":"5c80aa39-c840-4267-9677-bb82f387073d","Type":"ContainerStarted","Data":"abfb107d188854484028ff47fea895572a119cc0a6c969e19ca4434c1fb33fef"} Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.898270 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.902040 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dng4" event={"ID":"cc531703-8300-4682-bcc7-1772312976a9","Type":"ContainerStarted","Data":"641e9343403485b3d5394e70a3468ec4bb3d83c5cb3a0effb1119400b1bfef13"} Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.906520 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" event={"ID":"c7102b49-967d-49be-8a95-88d7ee3f8c94","Type":"ContainerStarted","Data":"32c6a1a4cd8705d01f525e4c3c47b104672756ede363bec8ddf7dd18329dd127"} Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.906682 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="dnsmasq-dns" containerID="cri-o://654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574" gracePeriod=10 Mar 19 19:17:49 crc kubenswrapper[4826]: I0319 19:17:49.993701 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xkjdp"] Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.045081 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.522933 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hhk58"] Mar 19 19:17:50 crc kubenswrapper[4826]: W0319 19:17:50.531187 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod789d342d_013e_45e5_a57b_cde9f8bc0d3f.slice/crio-e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a WatchSource:0}: Error finding container e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a: Status 404 returned error can't find the container with id e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.537388 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jgxmp"] Mar 19 19:17:50 crc kubenswrapper[4826]: W0319 19:17:50.577618 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef3b70e_ff7d_48a9_8796_8b20af6e6547.slice/crio-c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf WatchSource:0}: Error finding container c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf: Status 404 returned error can't find the container with id c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.682476 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.727861 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.730917 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:17:50 crc kubenswrapper[4826]: W0319 19:17:50.746277 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc865bcc_551d_4ae7_a0f3_af128e1d1e2d.slice/crio-198110759ac95ec5060deb3b0f6f55b21c28cdd021d6f4446c1c6feb3bb23295 WatchSource:0}: Error finding container 198110759ac95ec5060deb3b0f6f55b21c28cdd021d6f4446c1c6feb3bb23295: Status 404 returned error can't find the container with id 198110759ac95ec5060deb3b0f6f55b21c28cdd021d6f4446c1c6feb3bb23295 Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.748081 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.763950 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860088 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860404 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjc97\" (UniqueName: \"kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860422 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860448 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860475 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.860516 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc\") pod \"e1769715-e77c-4d52-944b-d380adc06ed3\" (UID: \"e1769715-e77c-4d52-944b-d380adc06ed3\") " Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.865248 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97" (OuterVolumeSpecName: "kube-api-access-sjc97") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "kube-api-access-sjc97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.963226 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjc97\" (UniqueName: \"kubernetes.io/projected/e1769715-e77c-4d52-944b-d380adc06ed3-kube-api-access-sjc97\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.969695 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.985592 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:17:50 crc kubenswrapper[4826]: I0319 19:17:50.992801 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config" (OuterVolumeSpecName: "config") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.018178 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.020090 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.041446 4826 generic.go:334] "Generic (PLEG): container finished" podID="c7102b49-967d-49be-8a95-88d7ee3f8c94" containerID="cc67db9c2fe2c6bf1ca1c8fdcad484bd8d8bf633fee08ebaba1bce9ecfbc614e" exitCode=0 Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.041510 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" event={"ID":"c7102b49-967d-49be-8a95-88d7ee3f8c94","Type":"ContainerDied","Data":"cc67db9c2fe2c6bf1ca1c8fdcad484bd8d8bf633fee08ebaba1bce9ecfbc614e"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.050498 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1769715-e77c-4d52-944b-d380adc06ed3" (UID: "e1769715-e77c-4d52-944b-d380adc06ed3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.058388 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerStarted","Data":"198110759ac95ec5060deb3b0f6f55b21c28cdd021d6f4446c1c6feb3bb23295"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.060726 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xkjdp" event={"ID":"30c5e21e-66a0-47a9-b03d-55fbfe372d1b","Type":"ContainerStarted","Data":"258275de3c3bcd4351ba5dbb36b1240c6ec9f22f8d761008ce9de87a1de528d6"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.076506 4826 generic.go:334] "Generic (PLEG): container finished" podID="e1769715-e77c-4d52-944b-d380adc06ed3" containerID="654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574" exitCode=0 Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.076566 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" event={"ID":"e1769715-e77c-4d52-944b-d380adc06ed3","Type":"ContainerDied","Data":"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.076646 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.076872 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-zdqw2" event={"ID":"e1769715-e77c-4d52-944b-d380adc06ed3","Type":"ContainerDied","Data":"c37ef2fc40147680023569860b9b60d3a99dc257ad45d37e781cbf0a556ae3df"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.076890 4826 scope.go:117] "RemoveContainer" containerID="654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.090910 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jgxmp" event={"ID":"9ef3b70e-ff7d-48a9-8796-8b20af6e6547","Type":"ContainerStarted","Data":"c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.094221 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hhk58" event={"ID":"789d342d-013e-45e5-a57b-cde9f8bc0d3f","Type":"ContainerStarted","Data":"e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.100534 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.100569 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.100580 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 
19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.100591 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.100602 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1769715-e77c-4d52-944b-d380adc06ed3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.102532 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" event={"ID":"6b4865a7-127e-4108-bc3a-3aac30103761","Type":"ContainerStarted","Data":"65eafa6a5d4ef1b8f2c1a3dd6fd553c988a604e3c12c1a76dede8218c7c98568"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.108070 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dng4" event={"ID":"cc531703-8300-4682-bcc7-1772312976a9","Type":"ContainerStarted","Data":"872c881ab948c81d0951bdb7bf9dadc68d5f06840e9b32507e13f4e851e6e533"} Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.124209 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.132910 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6dng4" podStartSLOduration=3.132891302 podStartE2EDuration="3.132891302s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:51.132169365 +0000 UTC m=+1295.886237698" watchObservedRunningTime="2026-03-19 19:17:51.132891302 +0000 UTC m=+1295.886959615" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.179207 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.190572 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-zdqw2"] Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.307381 4826 scope.go:117] "RemoveContainer" containerID="d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f" Mar 19 19:17:51 crc kubenswrapper[4826]: W0319 19:17:51.363199 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d5d8e18_256b_41c5_87b2_8323f9c74620.slice/crio-fbd85dcac6fa351a0b2a8e20dd8a2cfce2102c36bd733b20a1452ad7e3c1294c WatchSource:0}: Error finding container fbd85dcac6fa351a0b2a8e20dd8a2cfce2102c36bd733b20a1452ad7e3c1294c: Status 404 returned error can't find the container with id fbd85dcac6fa351a0b2a8e20dd8a2cfce2102c36bd733b20a1452ad7e3c1294c Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.374376 4826 scope.go:117] "RemoveContainer" containerID="654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574" Mar 19 19:17:51 crc kubenswrapper[4826]: E0319 19:17:51.375245 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574\": container with ID starting with 654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574 not found: ID does not exist" containerID="654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.375276 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574"} err="failed to get container status \"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574\": rpc error: code = NotFound desc = could not find container 
\"654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574\": container with ID starting with 654c6481d5eb7e9acc20850eeb8c537a18e9575673fd46e1476e2d913c1c8574 not found: ID does not exist" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.375296 4826 scope.go:117] "RemoveContainer" containerID="d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f" Mar 19 19:17:51 crc kubenswrapper[4826]: E0319 19:17:51.375909 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f\": container with ID starting with d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f not found: ID does not exist" containerID="d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.375930 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f"} err="failed to get container status \"d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f\": rpc error: code = NotFound desc = could not find container \"d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f\": container with ID starting with d9e40454feb8b49ff2c67220d3eea38143c3cf24188f29a3c6088d73a1609c8f not found: ID does not exist" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.618185 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.698877 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.720417 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.720594 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: W0319 19:17:51.749461 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29561cc2_3a45_4b4b_b97d_51575bb021d7.slice/crio-e7b8a0bba75a966f46588fd4088a155bd5fb45a0792ace401fa0536be594de75 WatchSource:0}: Error finding container e7b8a0bba75a966f46588fd4088a155bd5fb45a0792ace401fa0536be594de75: Status 404 returned error can't find the container with id e7b8a0bba75a966f46588fd4088a155bd5fb45a0792ace401fa0536be594de75 Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.785618 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.798460 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.833476 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.833544 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.833600 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhxb\" (UniqueName: \"kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.833669 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config\") pod \"c7102b49-967d-49be-8a95-88d7ee3f8c94\" (UID: \"c7102b49-967d-49be-8a95-88d7ee3f8c94\") " Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.834350 4826 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.834368 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.881809 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb" (OuterVolumeSpecName: "kube-api-access-bjhxb") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "kube-api-access-bjhxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.882918 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.885878 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.888430 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config" (OuterVolumeSpecName: "config") pod "c7102b49-967d-49be-8a95-88d7ee3f8c94" (UID: "c7102b49-967d-49be-8a95-88d7ee3f8c94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.937388 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhxb\" (UniqueName: \"kubernetes.io/projected/c7102b49-967d-49be-8a95-88d7ee3f8c94-kube-api-access-bjhxb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.937452 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.937465 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:51 crc kubenswrapper[4826]: I0319 19:17:51.937473 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7102b49-967d-49be-8a95-88d7ee3f8c94-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.044302 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" path="/var/lib/kubelet/pods/e1769715-e77c-4d52-944b-d380adc06ed3/volumes" Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.118020 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerStarted","Data":"fbd85dcac6fa351a0b2a8e20dd8a2cfce2102c36bd733b20a1452ad7e3c1294c"} Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.120052 4826 generic.go:334] "Generic (PLEG): container finished" podID="6b4865a7-127e-4108-bc3a-3aac30103761" containerID="3e3f7b8074e3de05d6a41fed9971959d1a337b230ae074fb29f0fadfd278f287" exitCode=0 Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.120097 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" event={"ID":"6b4865a7-127e-4108-bc3a-3aac30103761","Type":"ContainerDied","Data":"3e3f7b8074e3de05d6a41fed9971959d1a337b230ae074fb29f0fadfd278f287"} Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.124241 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.124261 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-97nhn" event={"ID":"c7102b49-967d-49be-8a95-88d7ee3f8c94","Type":"ContainerDied","Data":"32c6a1a4cd8705d01f525e4c3c47b104672756ede363bec8ddf7dd18329dd127"} Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.124617 4826 scope.go:117] "RemoveContainer" containerID="cc67db9c2fe2c6bf1ca1c8fdcad484bd8d8bf633fee08ebaba1bce9ecfbc614e" Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.198764 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.203753 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerStarted","Data":"e7b8a0bba75a966f46588fd4088a155bd5fb45a0792ace401fa0536be594de75"} Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.209936 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-bbf5cc879-97nhn"] Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.222006 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jgxmp" event={"ID":"9ef3b70e-ff7d-48a9-8796-8b20af6e6547","Type":"ContainerStarted","Data":"65170b93cfb43575d4256115d3e03de56f7c6e64fd93442416bc2e4a4f359520"} Mar 19 19:17:52 crc kubenswrapper[4826]: I0319 19:17:52.258041 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jgxmp" podStartSLOduration=4.258023066 podStartE2EDuration="4.258023066s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:52.235614984 +0000 UTC m=+1296.989683307" watchObservedRunningTime="2026-03-19 19:17:52.258023066 +0000 UTC m=+1297.012091379" Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.268519 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerStarted","Data":"607568831a713d54b5253632020405f47f1b0e4e0e2d1c7bfe2448ee22480bd5"} Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.280793 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerStarted","Data":"f9b24dca59990792911707f44c368633deacf829d20224880101a0320588b3ee"} Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.291815 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" event={"ID":"6b4865a7-127e-4108-bc3a-3aac30103761","Type":"ContainerStarted","Data":"c16a2cb6caeceb61b6c714fde783f85b19d924734e0dc6b9a1d10bab431c90f6"} Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.293058 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.334978 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" podStartSLOduration=5.334958344 podStartE2EDuration="5.334958344s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:53.312792248 +0000 UTC m=+1298.066860581" watchObservedRunningTime="2026-03-19 19:17:53.334958344 +0000 UTC m=+1298.089026657" Mar 19 19:17:53 crc kubenswrapper[4826]: I0319 19:17:53.997726 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7102b49-967d-49be-8a95-88d7ee3f8c94" path="/var/lib/kubelet/pods/c7102b49-967d-49be-8a95-88d7ee3f8c94/volumes" Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.403966 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerStarted","Data":"f0e76fd3575fa73c4103c1158165352541e3236ddcfe6d40a160d6afe52f76f8"} Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.404168 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-log" containerID="cri-o://607568831a713d54b5253632020405f47f1b0e4e0e2d1c7bfe2448ee22480bd5" gracePeriod=30 Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.404820 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-httpd" containerID="cri-o://f0e76fd3575fa73c4103c1158165352541e3236ddcfe6d40a160d6afe52f76f8" gracePeriod=30 Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.419470 4826 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-log" containerID="cri-o://f9b24dca59990792911707f44c368633deacf829d20224880101a0320588b3ee" gracePeriod=30 Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.419728 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerStarted","Data":"074c5d1c9911b84ee78079e0aba393ce4336d8bba1a8cac3706cffa03976c6bb"} Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.419800 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-httpd" containerID="cri-o://074c5d1c9911b84ee78079e0aba393ce4336d8bba1a8cac3706cffa03976c6bb" gracePeriod=30 Mar 19 19:17:54 crc kubenswrapper[4826]: I0319 19:17:54.443700 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.443684211 podStartE2EDuration="6.443684211s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:54.439030179 +0000 UTC m=+1299.193098492" watchObservedRunningTime="2026-03-19 19:17:54.443684211 +0000 UTC m=+1299.197752524" Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.401044 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.401416 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.433271 4826 generic.go:334] "Generic (PLEG): container finished" podID="cc531703-8300-4682-bcc7-1772312976a9" containerID="872c881ab948c81d0951bdb7bf9dadc68d5f06840e9b32507e13f4e851e6e533" exitCode=0 Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.433331 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dng4" event={"ID":"cc531703-8300-4682-bcc7-1772312976a9","Type":"ContainerDied","Data":"872c881ab948c81d0951bdb7bf9dadc68d5f06840e9b32507e13f4e851e6e533"} Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.435980 4826 generic.go:334] "Generic (PLEG): container finished" podID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerID="f0e76fd3575fa73c4103c1158165352541e3236ddcfe6d40a160d6afe52f76f8" exitCode=0 Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.436012 4826 generic.go:334] "Generic (PLEG): container finished" podID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerID="607568831a713d54b5253632020405f47f1b0e4e0e2d1c7bfe2448ee22480bd5" exitCode=143 Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.436011 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerDied","Data":"f0e76fd3575fa73c4103c1158165352541e3236ddcfe6d40a160d6afe52f76f8"} Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.436035 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerDied","Data":"607568831a713d54b5253632020405f47f1b0e4e0e2d1c7bfe2448ee22480bd5"} Mar 19 19:17:55 crc 
kubenswrapper[4826]: I0319 19:17:55.441475 4826 generic.go:334] "Generic (PLEG): container finished" podID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerID="074c5d1c9911b84ee78079e0aba393ce4336d8bba1a8cac3706cffa03976c6bb" exitCode=0 Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.441508 4826 generic.go:334] "Generic (PLEG): container finished" podID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerID="f9b24dca59990792911707f44c368633deacf829d20224880101a0320588b3ee" exitCode=143 Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.441579 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerDied","Data":"074c5d1c9911b84ee78079e0aba393ce4336d8bba1a8cac3706cffa03976c6bb"} Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.441639 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerDied","Data":"f9b24dca59990792911707f44c368633deacf829d20224880101a0320588b3ee"} Mar 19 19:17:55 crc kubenswrapper[4826]: I0319 19:17:55.451240 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.451222713 podStartE2EDuration="7.451222713s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:17:54.49491081 +0000 UTC m=+1299.248979123" watchObservedRunningTime="2026-03-19 19:17:55.451222713 +0000 UTC m=+1300.205291026" Mar 19 19:17:59 crc kubenswrapper[4826]: I0319 19:17:59.126979 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:17:59 crc kubenswrapper[4826]: I0319 19:17:59.243352 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:17:59 crc kubenswrapper[4826]: I0319 19:17:59.243625 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" containerID="cri-o://326c7e917beb6524f0a0ba588c56756018b6f25727555edcfa14a915da74514e" gracePeriod=10 Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.151787 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565798-jlzqc"] Mar 19 19:18:00 crc kubenswrapper[4826]: E0319 19:18:00.152991 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="dnsmasq-dns" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.153004 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="dnsmasq-dns" Mar 19 19:18:00 crc kubenswrapper[4826]: E0319 19:18:00.153030 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7102b49-967d-49be-8a95-88d7ee3f8c94" containerName="init" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.153036 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7102b49-967d-49be-8a95-88d7ee3f8c94" containerName="init" Mar 19 19:18:00 crc kubenswrapper[4826]: E0319 19:18:00.153066 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="init" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.153073 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="init" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.153553 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7102b49-967d-49be-8a95-88d7ee3f8c94" containerName="init" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.153581 4826 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e1769715-e77c-4d52-944b-d380adc06ed3" containerName="dnsmasq-dns" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.163754 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.166914 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.167132 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.167222 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.199145 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-jlzqc"] Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.242150 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzvr\" (UniqueName: \"kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr\") pod \"auto-csr-approver-29565798-jlzqc\" (UID: \"eef07f29-afd3-40df-a0c7-098109beedde\") " pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.344336 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzvr\" (UniqueName: \"kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr\") pod \"auto-csr-approver-29565798-jlzqc\" (UID: \"eef07f29-afd3-40df-a0c7-098109beedde\") " pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.375946 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-svzvr\" (UniqueName: \"kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr\") pod \"auto-csr-approver-29565798-jlzqc\" (UID: \"eef07f29-afd3-40df-a0c7-098109beedde\") " pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.499587 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.516760 4826 generic.go:334] "Generic (PLEG): container finished" podID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerID="326c7e917beb6524f0a0ba588c56756018b6f25727555edcfa14a915da74514e" exitCode=0 Mar 19 19:18:00 crc kubenswrapper[4826]: I0319 19:18:00.516931 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" event={"ID":"f22cc2a8-4d4f-42c8-b22e-478636b5259e","Type":"ContainerDied","Data":"326c7e917beb6524f0a0ba588c56756018b6f25727555edcfa14a915da74514e"} Mar 19 19:18:01 crc kubenswrapper[4826]: I0319 19:18:01.647286 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: connect: connection refused" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.081424 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.192344 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.192729 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.192852 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d2nn\" (UniqueName: \"kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.192997 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193186 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193358 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193514 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs" (OuterVolumeSpecName: "logs") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193684 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193952 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"2d5d8e18-256b-41c5-87b2-8323f9c74620\" (UID: \"2d5d8e18-256b-41c5-87b2-8323f9c74620\") " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.194999 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.193948 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.199902 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts" (OuterVolumeSpecName: "scripts") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.210133 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn" (OuterVolumeSpecName: "kube-api-access-8d2nn") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "kube-api-access-8d2nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.223286 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f" (OuterVolumeSpecName: "glance") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.246949 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.249900 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data" (OuterVolumeSpecName: "config-data") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.253323 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d5d8e18-256b-41c5-87b2-8323f9c74620" (UID: "2d5d8e18-256b-41c5-87b2-8323f9c74620"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.297711 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.297983 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d2nn\" (UniqueName: \"kubernetes.io/projected/2d5d8e18-256b-41c5-87b2-8323f9c74620-kube-api-access-8d2nn\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.298005 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.298018 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5d8e18-256b-41c5-87b2-8323f9c74620-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 
19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.298029 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.298044 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5d8e18-256b-41c5-87b2-8323f9c74620-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.298083 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") on node \"crc\" " Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.342568 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.342755 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f") on node "crc" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.400913 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.539411 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5d8e18-256b-41c5-87b2-8323f9c74620","Type":"ContainerDied","Data":"fbd85dcac6fa351a0b2a8e20dd8a2cfce2102c36bd733b20a1452ad7e3c1294c"} Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.539452 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.539462 4826 scope.go:117] "RemoveContainer" containerID="074c5d1c9911b84ee78079e0aba393ce4336d8bba1a8cac3706cffa03976c6bb" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.593355 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.614699 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.628757 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:18:02 crc kubenswrapper[4826]: E0319 19:18:02.629469 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-log" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.629497 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-log" Mar 19 19:18:02 crc kubenswrapper[4826]: E0319 19:18:02.629541 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-httpd" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.629553 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-httpd" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.629988 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-log" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.630022 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" containerName="glance-httpd" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.636830 4826 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.643164 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.643359 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.645228 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.717602 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.720943 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721104 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721301 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blf8s\" (UniqueName: \"kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721423 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721473 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721589 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.721677 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 
19:18:02.824170 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blf8s\" (UniqueName: \"kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824249 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824269 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824309 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824339 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824372 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824410 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.824456 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.825574 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.826015 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.828860 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.829131 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.830347 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0" Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.836860 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.837273 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5ddb69bbd0946a253964aa6f0f321c34484aa81c710d424a0f6be0ed74bf7c0/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.844497 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0"
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.847714 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blf8s\" (UniqueName: \"kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0"
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.890409 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " pod="openstack/glance-default-external-api-0"
Mar 19 19:18:02 crc kubenswrapper[4826]: I0319 19:18:02.972634 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 19 19:18:03 crc kubenswrapper[4826]: I0319 19:18:03.992009 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5d8e18-256b-41c5-87b2-8323f9c74620" path="/var/lib/kubelet/pods/2d5d8e18-256b-41c5-87b2-8323f9c74620/volumes"
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.602348 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dng4" event={"ID":"cc531703-8300-4682-bcc7-1772312976a9","Type":"ContainerDied","Data":"641e9343403485b3d5394e70a3468ec4bb3d83c5cb3a0effb1119400b1bfef13"}
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.602382 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641e9343403485b3d5394e70a3468ec4bb3d83c5cb3a0effb1119400b1bfef13"
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.606283 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dng4"
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758151 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758520 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758694 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758842 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758864 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cztns\" (UniqueName: \"kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.758902 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys\") pod \"cc531703-8300-4682-bcc7-1772312976a9\" (UID: \"cc531703-8300-4682-bcc7-1772312976a9\") "
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.781267 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns" (OuterVolumeSpecName: "kube-api-access-cztns") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "kube-api-access-cztns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.781743 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts" (OuterVolumeSpecName: "scripts") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.783119 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.783794 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.798183 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data" (OuterVolumeSpecName: "config-data") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.804679 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc531703-8300-4682-bcc7-1772312976a9" (UID: "cc531703-8300-4682-bcc7-1772312976a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.861947 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.861999 4826 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.862015 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cztns\" (UniqueName: \"kubernetes.io/projected/cc531703-8300-4682-bcc7-1772312976a9-kube-api-access-cztns\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.862030 4826 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.862044 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: I0319 19:18:06.862055 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc531703-8300-4682-bcc7-1772312976a9-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:06 crc kubenswrapper[4826]: E0319 19:18:06.998271 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Mar 19 19:18:06 crc kubenswrapper[4826]: E0319 19:18:06.998811 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66dh65fh59dh575hbh567h648h5dch55fh566h669h578h685h58dh559h96h66fh545h57ch556h9bh667h5ddh557h697h656hd5h576hd5h5dhcfhbbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wckx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(dc865bcc-551d-4ae7-a0f3-af128e1d1e2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.614228 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dng4"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.704705 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6dng4"]
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.711722 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6dng4"]
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.789260 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7bsn"]
Mar 19 19:18:07 crc kubenswrapper[4826]: E0319 19:18:07.798802 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc531703-8300-4682-bcc7-1772312976a9" containerName="keystone-bootstrap"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.798833 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc531703-8300-4682-bcc7-1772312976a9" containerName="keystone-bootstrap"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.799125 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc531703-8300-4682-bcc7-1772312976a9" containerName="keystone-bootstrap"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.799757 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7bsn"]
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.799846 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.802271 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.802417 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.806127 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.806431 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sxv9p"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.806536 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.988487 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc531703-8300-4682-bcc7-1772312976a9" path="/var/lib/kubelet/pods/cc531703-8300-4682-bcc7-1772312976a9/volumes"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.989896 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.989947 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.990048 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.990150 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.990261 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:07 crc kubenswrapper[4826]: I0319 19:18:07.990293 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9n6\" (UniqueName: \"kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.092907 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.092977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9n6\" (UniqueName: \"kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.093118 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.093146 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.093240 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.093644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.099871 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.099952 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.100419 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.101694 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.110820 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.116470 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9n6\" (UniqueName: \"kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6\") pod \"keystone-bootstrap-s7bsn\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:08 crc kubenswrapper[4826]: I0319 19:18:08.122639 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7bsn"
Mar 19 19:18:11 crc kubenswrapper[4826]: I0319 19:18:11.648070 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: i/o timeout"
Mar 19 19:18:12 crc kubenswrapper[4826]: I0319 19:18:12.672100 4826 generic.go:334] "Generic (PLEG): container finished" podID="9ef3b70e-ff7d-48a9-8796-8b20af6e6547" containerID="65170b93cfb43575d4256115d3e03de56f7c6e64fd93442416bc2e4a4f359520" exitCode=0
Mar 19 19:18:12 crc kubenswrapper[4826]: I0319 19:18:12.672197 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jgxmp" event={"ID":"9ef3b70e-ff7d-48a9-8796-8b20af6e6547","Type":"ContainerDied","Data":"65170b93cfb43575d4256115d3e03de56f7c6e64fd93442416bc2e4a4f359520"}
Mar 19 19:18:15 crc kubenswrapper[4826]: E0319 19:18:15.784856 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Mar 19 19:18:15 crc kubenswrapper[4826]: E0319 19:18:15.785436 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nrl4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xkjdp_openstack(30c5e21e-66a0-47a9-b03d-55fbfe372d1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 19:18:15 crc kubenswrapper[4826]: E0319 19:18:15.789750 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xkjdp" podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b"
Mar 19 19:18:15 crc kubenswrapper[4826]: I0319 19:18:15.921426 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101307 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101416 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101561 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzbb\" (UniqueName: \"kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101599 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101649 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101721 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101764 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101906 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101913 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs" (OuterVolumeSpecName: "logs") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.101934 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle\") pod \"29561cc2-3a45-4b4b-b97d-51575bb021d7\" (UID: \"29561cc2-3a45-4b4b-b97d-51575bb021d7\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.103720 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-logs\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.103753 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29561cc2-3a45-4b4b-b97d-51575bb021d7-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.108823 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb" (OuterVolumeSpecName: "kube-api-access-6pzbb") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "kube-api-access-6pzbb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.121071 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts" (OuterVolumeSpecName: "scripts") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.129804 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e" (OuterVolumeSpecName: "glance") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.131258 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.157178 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data" (OuterVolumeSpecName: "config-data") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.180524 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29561cc2-3a45-4b4b-b97d-51575bb021d7" (UID: "29561cc2-3a45-4b4b-b97d-51575bb021d7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254891 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254931 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254944 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254954 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pzbb\" (UniqueName: \"kubernetes.io/projected/29561cc2-3a45-4b4b-b97d-51575bb021d7-kube-api-access-6pzbb\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254988 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") on node \"crc\" "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.254998 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29561cc2-3a45-4b4b-b97d-51575bb021d7-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.294016 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.294253 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e") on node "crc"
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.356325 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") on node \"crc\" DevicePath \"\""
Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.390441 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified"
Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.390562 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j8kf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-xfct2_openstack(d92c353f-6fae-4be8-8580-4066bb56e856): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.392115 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-xfct2" podUID="d92c353f-6fae-4be8-8580-4066bb56e856"
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.393762 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7"
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.409127 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jgxmp"
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.457799 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2wq2\" (UniqueName: \"kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2\") pod \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.457909 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mxb\" (UniqueName: \"kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb\") pod \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.457994 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb\") pod \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.458047 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc\") pod \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.458102 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config\") pod \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.458184 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb\") pod \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\" (UID: \"f22cc2a8-4d4f-42c8-b22e-478636b5259e\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.458283 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle\") pod \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.458328 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config\") pod \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\" (UID: \"9ef3b70e-ff7d-48a9-8796-8b20af6e6547\") "
Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.463170 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2" (OuterVolumeSpecName: "kube-api-access-l2wq2") pod "f22cc2a8-4d4f-42c8-b22e-478636b5259e" (UID: "f22cc2a8-4d4f-42c8-b22e-478636b5259e"). InnerVolumeSpecName "kube-api-access-l2wq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.463950 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb" (OuterVolumeSpecName: "kube-api-access-47mxb") pod "9ef3b70e-ff7d-48a9-8796-8b20af6e6547" (UID: "9ef3b70e-ff7d-48a9-8796-8b20af6e6547"). InnerVolumeSpecName "kube-api-access-47mxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.512387 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ef3b70e-ff7d-48a9-8796-8b20af6e6547" (UID: "9ef3b70e-ff7d-48a9-8796-8b20af6e6547"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.525156 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config" (OuterVolumeSpecName: "config") pod "9ef3b70e-ff7d-48a9-8796-8b20af6e6547" (UID: "9ef3b70e-ff7d-48a9-8796-8b20af6e6547"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.530555 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config" (OuterVolumeSpecName: "config") pod "f22cc2a8-4d4f-42c8-b22e-478636b5259e" (UID: "f22cc2a8-4d4f-42c8-b22e-478636b5259e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.533748 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f22cc2a8-4d4f-42c8-b22e-478636b5259e" (UID: "f22cc2a8-4d4f-42c8-b22e-478636b5259e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.541192 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f22cc2a8-4d4f-42c8-b22e-478636b5259e" (UID: "f22cc2a8-4d4f-42c8-b22e-478636b5259e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.555927 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f22cc2a8-4d4f-42c8-b22e-478636b5259e" (UID: "f22cc2a8-4d4f-42c8-b22e-478636b5259e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562433 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47mxb\" (UniqueName: \"kubernetes.io/projected/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-kube-api-access-47mxb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562466 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562483 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562496 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562512 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f22cc2a8-4d4f-42c8-b22e-478636b5259e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562524 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562535 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ef3b70e-ff7d-48a9-8796-8b20af6e6547-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.562551 4826 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-l2wq2\" (UniqueName: \"kubernetes.io/projected/f22cc2a8-4d4f-42c8-b22e-478636b5259e-kube-api-access-l2wq2\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.649072 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: i/o timeout" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.649178 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.727286 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" event={"ID":"f22cc2a8-4d4f-42c8-b22e-478636b5259e","Type":"ContainerDied","Data":"dd37605856f33ca28b622c507288eac0f0b11f5073b144c9890a2d1128a75804"} Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.727334 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nqjw7" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.737284 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"29561cc2-3a45-4b4b-b97d-51575bb021d7","Type":"ContainerDied","Data":"e7b8a0bba75a966f46588fd4088a155bd5fb45a0792ace401fa0536be594de75"} Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.737469 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.751781 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jgxmp" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.752157 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jgxmp" event={"ID":"9ef3b70e-ff7d-48a9-8796-8b20af6e6547","Type":"ContainerDied","Data":"c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf"} Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.752206 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c22f0e8cf02cbecbc107b706b9d26d98d7eb45707a3c7e76a69447360192fddf" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.754032 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xkjdp" podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.757404 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-xfct2" podUID="d92c353f-6fae-4be8-8580-4066bb56e856" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.854849 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.864716 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nqjw7"] Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.886759 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.901645 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.912679 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.913291 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-httpd" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913307 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-httpd" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.913319 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913327 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.913346 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef3b70e-ff7d-48a9-8796-8b20af6e6547" containerName="neutron-db-sync" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913353 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef3b70e-ff7d-48a9-8796-8b20af6e6547" containerName="neutron-db-sync" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.913377 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="init" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913384 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="init" Mar 19 19:18:16 crc kubenswrapper[4826]: E0319 19:18:16.913401 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-log" Mar 19 19:18:16 crc 
kubenswrapper[4826]: I0319 19:18:16.913408 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-log" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913740 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-log" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913756 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" containerName="dnsmasq-dns" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913778 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef3b70e-ff7d-48a9-8796-8b20af6e6547" containerName="neutron-db-sync" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.913802 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" containerName="glance-httpd" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.915247 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.917937 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.918112 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 19:18:16 crc kubenswrapper[4826]: I0319 19:18:16.924387 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074006 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074068 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074185 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074386 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074437 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj2jl\" (UniqueName: \"kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074627 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074848 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.074889 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176319 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176425 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176457 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176527 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176554 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176620 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176656 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj2jl\" (UniqueName: \"kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.176835 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.179592 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.180456 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.180887 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.181466 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.189313 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.190614 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.190643 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a5c6a8766a786cbd3eca35b0f5a1e3802ae1e4cbe235d9479277134f5caec0c/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.193811 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj2jl\" (UniqueName: \"kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.226380 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.242809 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.750135 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.757111 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.777197 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799157 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799224 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799247 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799276 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799364 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.799494 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5nfw\" (UniqueName: \"kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.817134 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.818810 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.822087 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.822265 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.822364 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9hfvr" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.824017 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.875820 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902128 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902171 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902204 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902223 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902265 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902285 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902323 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902376 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsbl\" (UniqueName: \"kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902407 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5nfw\" (UniqueName: \"kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902454 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.902486 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.903335 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.903343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.903734 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.904031 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.904149 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:17 crc kubenswrapper[4826]: I0319 19:18:17.929735 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5nfw\" (UniqueName: \"kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw\") pod \"dnsmasq-dns-6b7b667979-t2kwz\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.005334 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.005465 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.005578 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.005744 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsbl\" (UniqueName: \"kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " 
pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.005859 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.008227 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29561cc2-3a45-4b4b-b97d-51575bb021d7" path="/var/lib/kubelet/pods/29561cc2-3a45-4b4b-b97d-51575bb021d7/volumes" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.009485 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.010158 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22cc2a8-4d4f-42c8-b22e-478636b5259e" path="/var/lib/kubelet/pods/f22cc2a8-4d4f-42c8-b22e-478636b5259e/volumes" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.014602 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.017134 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " 
pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.017858 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.030761 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsbl\" (UniqueName: \"kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl\") pod \"neutron-dddfc89b6-hxwk2\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.119830 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.153566 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.223917 4826 scope.go:117] "RemoveContainer" containerID="f9b24dca59990792911707f44c368633deacf829d20224880101a0320588b3ee" Mar 19 19:18:18 crc kubenswrapper[4826]: E0319 19:18:18.256501 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 19 19:18:18 crc kubenswrapper[4826]: E0319 19:18:18.256676 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,
MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4t44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-p7jzd_openstack(5c80aa39-c840-4267-9677-bb82f387073d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:18:18 crc kubenswrapper[4826]: E0319 19:18:18.258580 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-p7jzd" podUID="5c80aa39-c840-4267-9677-bb82f387073d" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.354241 4826 scope.go:117] "RemoveContainer" containerID="326c7e917beb6524f0a0ba588c56756018b6f25727555edcfa14a915da74514e" Mar 19 19:18:18 crc kubenswrapper[4826]: I0319 19:18:18.785957 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-jlzqc"] Mar 19 19:18:18 crc kubenswrapper[4826]: E0319 19:18:18.800082 4826 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-p7jzd" podUID="5c80aa39-c840-4267-9677-bb82f387073d" Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.349359 4826 scope.go:117] "RemoveContainer" containerID="15014fa0408a8c08289b2115c220d8fc06ec8c78be7b7f7d342af741e3a520f0" Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.438214 4826 scope.go:117] "RemoveContainer" containerID="f0e76fd3575fa73c4103c1158165352541e3236ddcfe6d40a160d6afe52f76f8" Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.676799 4826 scope.go:117] "RemoveContainer" containerID="607568831a713d54b5253632020405f47f1b0e4e0e2d1c7bfe2448ee22480bd5" Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.807497 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" event={"ID":"eef07f29-afd3-40df-a0c7-098109beedde","Type":"ContainerStarted","Data":"77479c8180fac23fefdcb6ea632875bcf737782715f6633c65fa4fec32b28ac1"} Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.921277 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7bsn"] Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.994923 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:19 crc kubenswrapper[4826]: I0319 19:18:19.999578 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.002541 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.002673 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.008486 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.026960 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.150303 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.150340 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.150415 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmv8\" (UniqueName: \"kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 
19:18:20.151370 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.151449 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.151612 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.152015 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.225876 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.253547 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmv8\" (UniqueName: \"kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8\") pod \"neutron-8bf647cd5-pw2jr\" (UID: 
\"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.253608 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.253632 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.253825 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.254067 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.254190 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: 
I0319 19:18:20.254223 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.259980 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.260048 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.260164 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.260257 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.263190 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.264068 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.280439 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmv8\" (UniqueName: \"kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8\") pod \"neutron-8bf647cd5-pw2jr\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.306816 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.322175 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:20 crc kubenswrapper[4826]: W0319 19:18:20.632113 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee13755_098f_4b30_8f68_4376adb9d4aa.slice/crio-c70ffe91fc6694f141565003afb38ef310c5fda8d8043c89fa5b94ee5e7a2c0b WatchSource:0}: Error finding container c70ffe91fc6694f141565003afb38ef310c5fda8d8043c89fa5b94ee5e7a2c0b: Status 404 returned error can't find the container with id c70ffe91fc6694f141565003afb38ef310c5fda8d8043c89fa5b94ee5e7a2c0b Mar 19 19:18:20 crc kubenswrapper[4826]: W0319 19:18:20.641818 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b2afd63_9250_4f56_be2a_271b75704b95.slice/crio-b453f513f1f8eb5c9b2f27cd22a881584f4abcac16a980e3c4eaf84ea7098067 WatchSource:0}: Error finding container b453f513f1f8eb5c9b2f27cd22a881584f4abcac16a980e3c4eaf84ea7098067: Status 404 returned error can't find the container with id b453f513f1f8eb5c9b2f27cd22a881584f4abcac16a980e3c4eaf84ea7098067 Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.822556 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerStarted","Data":"c70ffe91fc6694f141565003afb38ef310c5fda8d8043c89fa5b94ee5e7a2c0b"} Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.824118 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" event={"ID":"fe47625c-be5d-44ad-be5f-b64628005833","Type":"ContainerStarted","Data":"a25d2483196dcd97c82aec1949be20fa664f9128e15cf2b80e33f55b246fe072"} Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.825832 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hhk58" 
event={"ID":"789d342d-013e-45e5-a57b-cde9f8bc0d3f","Type":"ContainerStarted","Data":"1c91fd70db79cc97b6c155a9aa2c4c06984d69be8279807468057978454a73d9"} Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.828371 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerStarted","Data":"b453f513f1f8eb5c9b2f27cd22a881584f4abcac16a980e3c4eaf84ea7098067"} Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.829684 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7bsn" event={"ID":"1abad829-61e3-47f1-b24d-58ccb40e58f7","Type":"ContainerStarted","Data":"cac09af41bfeadb72bde553a8afbc7e73361f685bef5f794324d2d33c815f212"} Mar 19 19:18:20 crc kubenswrapper[4826]: I0319 19:18:20.846936 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hhk58" podStartSLOduration=7.021716931 podStartE2EDuration="32.846913736s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="2026-03-19 19:17:50.54200279 +0000 UTC m=+1295.296071103" lastFinishedPulling="2026-03-19 19:18:16.367199595 +0000 UTC m=+1321.121267908" observedRunningTime="2026-03-19 19:18:20.838883083 +0000 UTC m=+1325.592951426" watchObservedRunningTime="2026-03-19 19:18:20.846913736 +0000 UTC m=+1325.600982049" Mar 19 19:18:21 crc kubenswrapper[4826]: I0319 19:18:21.120899 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:18:22 crc kubenswrapper[4826]: I0319 19:18:22.252560 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:18:22 crc kubenswrapper[4826]: W0319 19:18:22.277116 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef1c972_c655_460d_a249_a1f7d4990b2f.slice/crio-6231258059cb48cf598379d2a2b3f2474af44be853875b7a576d0be465f40737 WatchSource:0}: Error finding container 6231258059cb48cf598379d2a2b3f2474af44be853875b7a576d0be465f40737: Status 404 returned error can't find the container with id 6231258059cb48cf598379d2a2b3f2474af44be853875b7a576d0be465f40737 Mar 19 19:18:22 crc kubenswrapper[4826]: I0319 19:18:22.872454 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7bsn" event={"ID":"1abad829-61e3-47f1-b24d-58ccb40e58f7","Type":"ContainerStarted","Data":"0deba0cf5d3615df13eb6eb2348c1c67555e0d143f4cccf28346e05fe136c7bd"} Mar 19 19:18:22 crc kubenswrapper[4826]: I0319 19:18:22.873962 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerStarted","Data":"6231258059cb48cf598379d2a2b3f2474af44be853875b7a576d0be465f40737"} Mar 19 19:18:22 crc kubenswrapper[4826]: I0319 19:18:22.912589 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7bsn" podStartSLOduration=15.912567853 podStartE2EDuration="15.912567853s" podCreationTimestamp="2026-03-19 19:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:22.897308164 +0000 UTC m=+1327.651376477" watchObservedRunningTime="2026-03-19 19:18:22.912567853 +0000 UTC m=+1327.666636176" Mar 19 19:18:22 crc kubenswrapper[4826]: I0319 19:18:22.957006 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:22 crc kubenswrapper[4826]: W0319 19:18:22.963771 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce18435f_6584_401e_b983_2d82bb66f9b3.slice/crio-65b26f8c3ab04755968bea8b177a153398a852fdb2a6b750a219a0567594dcea WatchSource:0}: Error finding container 65b26f8c3ab04755968bea8b177a153398a852fdb2a6b750a219a0567594dcea: Status 404 returned error can't find the container with id 65b26f8c3ab04755968bea8b177a153398a852fdb2a6b750a219a0567594dcea Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.913906 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerStarted","Data":"cd104fc87b47839a2c9c5fcb924ef27eb67868772e968598c2ea2c5177099e9b"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.914386 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerStarted","Data":"83e7c3bfc46a79113e6c663c7ed9ecee31bd393e79e33163c32138d7bb8d0b91"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.915564 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.926256 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerStarted","Data":"f1b048f794c806a00b659dc40b9462845f9f298f42c762391b8b6db5f6b14904"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.952333 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dddfc89b6-hxwk2" podStartSLOduration=6.952311172 podStartE2EDuration="6.952311172s" podCreationTimestamp="2026-03-19 19:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:23.942604847 +0000 UTC 
m=+1328.696673160" watchObservedRunningTime="2026-03-19 19:18:23.952311172 +0000 UTC m=+1328.706379495" Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.960927 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerStarted","Data":"039b77b34fa8a5bdae7d4e93f3a043efe118970621120649b5587efc23cd8535"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.960965 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerStarted","Data":"5171a83f6c4abf2fc8b10d0be3c0168e50375a59d480df0886c0b1cb6d330034"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.960975 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerStarted","Data":"65b26f8c3ab04755968bea8b177a153398a852fdb2a6b750a219a0567594dcea"} Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.961007 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:23 crc kubenswrapper[4826]: I0319 19:18:23.991800 4826 generic.go:334] "Generic (PLEG): container finished" podID="fe47625c-be5d-44ad-be5f-b64628005833" containerID="17abd6d3328ad0c68ad1637abc0c078d40b863486b2545ab438dc928843ace1d" exitCode=0 Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.010561 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" event={"ID":"fe47625c-be5d-44ad-be5f-b64628005833","Type":"ContainerDied","Data":"17abd6d3328ad0c68ad1637abc0c078d40b863486b2545ab438dc928843ace1d"} Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.022772 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerStarted","Data":"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73"} Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.032342 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" event={"ID":"eef07f29-afd3-40df-a0c7-098109beedde","Type":"ContainerStarted","Data":"030d67dc9fded5f77bdaa9901408cf79bf4434eeb018ce2652412744efa216e6"} Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.038271 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerStarted","Data":"3b7f96ca48f5f84c67af82ba8d22e31bfc942a1b6150444356ba090ccd0a15de"} Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.043573 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8bf647cd5-pw2jr" podStartSLOduration=5.043554008 podStartE2EDuration="5.043554008s" podCreationTimestamp="2026-03-19 19:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:24.004165445 +0000 UTC m=+1328.758233758" watchObservedRunningTime="2026-03-19 19:18:24.043554008 +0000 UTC m=+1328.797622321" Mar 19 19:18:24 crc kubenswrapper[4826]: I0319 19:18:24.066670 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" podStartSLOduration=20.675054683 podStartE2EDuration="24.066640215s" podCreationTimestamp="2026-03-19 19:18:00 +0000 UTC" firstStartedPulling="2026-03-19 19:18:19.308232159 +0000 UTC m=+1324.062300472" lastFinishedPulling="2026-03-19 19:18:22.699817681 +0000 UTC m=+1327.453886004" observedRunningTime="2026-03-19 19:18:24.055993068 +0000 UTC m=+1328.810061411" watchObservedRunningTime="2026-03-19 19:18:24.066640215 +0000 UTC m=+1328.820708528" Mar 
19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.059564 4826 generic.go:334] "Generic (PLEG): container finished" podID="eef07f29-afd3-40df-a0c7-098109beedde" containerID="030d67dc9fded5f77bdaa9901408cf79bf4434eeb018ce2652412744efa216e6" exitCode=0 Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.060364 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" event={"ID":"eef07f29-afd3-40df-a0c7-098109beedde","Type":"ContainerDied","Data":"030d67dc9fded5f77bdaa9901408cf79bf4434eeb018ce2652412744efa216e6"} Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.067588 4826 generic.go:334] "Generic (PLEG): container finished" podID="789d342d-013e-45e5-a57b-cde9f8bc0d3f" containerID="1c91fd70db79cc97b6c155a9aa2c4c06984d69be8279807468057978454a73d9" exitCode=0 Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.067676 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hhk58" event={"ID":"789d342d-013e-45e5-a57b-cde9f8bc0d3f","Type":"ContainerDied","Data":"1c91fd70db79cc97b6c155a9aa2c4c06984d69be8279807468057978454a73d9"} Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.076964 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerStarted","Data":"a0ea25b10134f259221bf3f106ba88d381ec21f9146a923476d1ce60455d1651"} Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.092546 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerStarted","Data":"7f01b1f044c2e1724a50234bed289541d76936595212ea22858965ec18c6384c"} Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.108608 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" 
event={"ID":"fe47625c-be5d-44ad-be5f-b64628005833","Type":"ContainerStarted","Data":"2fad22d114982867e0d3f0a2233fb55f638b563f6899839e4b215945422a348e"} Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.127490 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=23.127468005 podStartE2EDuration="23.127468005s" podCreationTimestamp="2026-03-19 19:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:25.110293179 +0000 UTC m=+1329.864361512" watchObservedRunningTime="2026-03-19 19:18:25.127468005 +0000 UTC m=+1329.881536318" Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.142508 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" podStartSLOduration=8.142487668 podStartE2EDuration="8.142487668s" podCreationTimestamp="2026-03-19 19:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:25.131438811 +0000 UTC m=+1329.885507134" watchObservedRunningTime="2026-03-19 19:18:25.142487668 +0000 UTC m=+1329.896555981" Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.169112 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.16906999 podStartE2EDuration="9.16906999s" podCreationTimestamp="2026-03-19 19:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:25.155335289 +0000 UTC m=+1329.909403602" watchObservedRunningTime="2026-03-19 19:18:25.16906999 +0000 UTC m=+1329.923138313" Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.400764 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:18:25 crc kubenswrapper[4826]: I0319 19:18:25.400827 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:18:26 crc kubenswrapper[4826]: I0319 19:18:26.124297 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:27 crc kubenswrapper[4826]: I0319 19:18:27.240958 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:27 crc kubenswrapper[4826]: I0319 19:18:27.243057 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:27 crc kubenswrapper[4826]: I0319 19:18:27.279276 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:27 crc kubenswrapper[4826]: I0319 19:18:27.297873 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:28 crc kubenswrapper[4826]: I0319 19:18:28.148497 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:28 crc kubenswrapper[4826]: I0319 19:18:28.148774 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:29 crc kubenswrapper[4826]: I0319 19:18:29.169183 4826 generic.go:334] "Generic (PLEG): container 
finished" podID="1abad829-61e3-47f1-b24d-58ccb40e58f7" containerID="0deba0cf5d3615df13eb6eb2348c1c67555e0d143f4cccf28346e05fe136c7bd" exitCode=0 Mar 19 19:18:29 crc kubenswrapper[4826]: I0319 19:18:29.169861 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7bsn" event={"ID":"1abad829-61e3-47f1-b24d-58ccb40e58f7","Type":"ContainerDied","Data":"0deba0cf5d3615df13eb6eb2348c1c67555e0d143f4cccf28346e05fe136c7bd"} Mar 19 19:18:29 crc kubenswrapper[4826]: I0319 19:18:29.921086 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:29 crc kubenswrapper[4826]: I0319 19:18:29.929348 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hhk58" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018338 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle\") pod \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018412 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data\") pod \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018543 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts\") pod \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018595 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-75899\" (UniqueName: \"kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899\") pod \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018764 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svzvr\" (UniqueName: \"kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr\") pod \"eef07f29-afd3-40df-a0c7-098109beedde\" (UID: \"eef07f29-afd3-40df-a0c7-098109beedde\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.018796 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs\") pod \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\" (UID: \"789d342d-013e-45e5-a57b-cde9f8bc0d3f\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.020328 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs" (OuterVolumeSpecName: "logs") pod "789d342d-013e-45e5-a57b-cde9f8bc0d3f" (UID: "789d342d-013e-45e5-a57b-cde9f8bc0d3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.033843 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts" (OuterVolumeSpecName: "scripts") pod "789d342d-013e-45e5-a57b-cde9f8bc0d3f" (UID: "789d342d-013e-45e5-a57b-cde9f8bc0d3f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.035708 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr" (OuterVolumeSpecName: "kube-api-access-svzvr") pod "eef07f29-afd3-40df-a0c7-098109beedde" (UID: "eef07f29-afd3-40df-a0c7-098109beedde"). InnerVolumeSpecName "kube-api-access-svzvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.041860 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899" (OuterVolumeSpecName: "kube-api-access-75899") pod "789d342d-013e-45e5-a57b-cde9f8bc0d3f" (UID: "789d342d-013e-45e5-a57b-cde9f8bc0d3f"). InnerVolumeSpecName "kube-api-access-75899". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.058588 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "789d342d-013e-45e5-a57b-cde9f8bc0d3f" (UID: "789d342d-013e-45e5-a57b-cde9f8bc0d3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.087439 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data" (OuterVolumeSpecName: "config-data") pod "789d342d-013e-45e5-a57b-cde9f8bc0d3f" (UID: "789d342d-013e-45e5-a57b-cde9f8bc0d3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120718 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75899\" (UniqueName: \"kubernetes.io/projected/789d342d-013e-45e5-a57b-cde9f8bc0d3f-kube-api-access-75899\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120752 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svzvr\" (UniqueName: \"kubernetes.io/projected/eef07f29-afd3-40df-a0c7-098109beedde-kube-api-access-svzvr\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120764 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/789d342d-013e-45e5-a57b-cde9f8bc0d3f-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120773 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120782 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.120792 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/789d342d-013e-45e5-a57b-cde9f8bc0d3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.181856 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" event={"ID":"eef07f29-afd3-40df-a0c7-098109beedde","Type":"ContainerDied","Data":"77479c8180fac23fefdcb6ea632875bcf737782715f6633c65fa4fec32b28ac1"} Mar 19 19:18:30 crc 
kubenswrapper[4826]: I0319 19:18:30.181896 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77479c8180fac23fefdcb6ea632875bcf737782715f6633c65fa4fec32b28ac1" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.181953 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565798-jlzqc" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.190672 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hhk58" event={"ID":"789d342d-013e-45e5-a57b-cde9f8bc0d3f","Type":"ContainerDied","Data":"e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a"} Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.190730 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e45ca12a95b5c3eb600679543a7e0ec9009925206c4c8dbbce8a57b3d4c5190a" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.190770 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hhk58" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.522818 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7bsn" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633322 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633454 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633648 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9n6\" (UniqueName: \"kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633703 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633727 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.633770 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle\") pod \"1abad829-61e3-47f1-b24d-58ccb40e58f7\" (UID: \"1abad829-61e3-47f1-b24d-58ccb40e58f7\") " Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.647995 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.659067 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6" (OuterVolumeSpecName: "kube-api-access-zs9n6") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "kube-api-access-zs9n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.664885 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.664986 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts" (OuterVolumeSpecName: "scripts") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.696183 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data" (OuterVolumeSpecName: "config-data") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.737292 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.737325 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9n6\" (UniqueName: \"kubernetes.io/projected/1abad829-61e3-47f1-b24d-58ccb40e58f7-kube-api-access-zs9n6\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.737334 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.737343 4826 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.737350 4826 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.744419 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1abad829-61e3-47f1-b24d-58ccb40e58f7" (UID: "1abad829-61e3-47f1-b24d-58ccb40e58f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.838950 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abad829-61e3-47f1-b24d-58ccb40e58f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:30 crc kubenswrapper[4826]: I0319 19:18:30.993990 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-h8n58"] Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.014005 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565792-h8n58"] Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.066146 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:18:31 crc kubenswrapper[4826]: E0319 19:18:31.067300 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef07f29-afd3-40df-a0c7-098109beedde" containerName="oc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.067326 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef07f29-afd3-40df-a0c7-098109beedde" containerName="oc" Mar 19 19:18:31 crc kubenswrapper[4826]: E0319 19:18:31.067369 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789d342d-013e-45e5-a57b-cde9f8bc0d3f" containerName="placement-db-sync" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.067378 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="789d342d-013e-45e5-a57b-cde9f8bc0d3f" containerName="placement-db-sync" Mar 19 19:18:31 crc kubenswrapper[4826]: E0319 19:18:31.067403 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1abad829-61e3-47f1-b24d-58ccb40e58f7" containerName="keystone-bootstrap" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.067411 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abad829-61e3-47f1-b24d-58ccb40e58f7" containerName="keystone-bootstrap" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.067981 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="789d342d-013e-45e5-a57b-cde9f8bc0d3f" containerName="placement-db-sync" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.068004 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abad829-61e3-47f1-b24d-58ccb40e58f7" containerName="keystone-bootstrap" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.068048 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef07f29-afd3-40df-a0c7-098109beedde" containerName="oc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.071520 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.079076 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.093345 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.093644 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-66wpf" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.093797 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.093925 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.108803 4826 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156244 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156342 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156422 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156557 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156584 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle\") pod \"placement-849c97b598-8kp5r\" (UID: 
\"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156727 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.156782 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj97v\" (UniqueName: \"kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.203917 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerStarted","Data":"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59"} Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.205974 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xkjdp" event={"ID":"30c5e21e-66a0-47a9-b03d-55fbfe372d1b","Type":"ContainerStarted","Data":"1ef54ebc6936b221c2cf1fbad176ea2f4a00c6202a38a5c29df95790ff70da95"} Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.207595 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7bsn" event={"ID":"1abad829-61e3-47f1-b24d-58ccb40e58f7","Type":"ContainerDied","Data":"cac09af41bfeadb72bde553a8afbc7e73361f685bef5f794324d2d33c815f212"} Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.207638 4826 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cac09af41bfeadb72bde553a8afbc7e73361f685bef5f794324d2d33c815f212" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.207684 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7bsn" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.217546 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xfct2" event={"ID":"d92c353f-6fae-4be8-8580-4066bb56e856","Type":"ContainerStarted","Data":"d4baa6814b918b1dfbc41b1d24266caac18661ff58725f3f3044a81bdf0da630"} Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.235238 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p7jzd" event={"ID":"5c80aa39-c840-4267-9677-bb82f387073d","Type":"ContainerStarted","Data":"7c58f9f262ad39c6602e773a8352f8bab23f243e0f8ad7878bd22836f1982c79"} Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.238284 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xkjdp" podStartSLOduration=3.359858107 podStartE2EDuration="43.238265649s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="2026-03-19 19:17:50.064456428 +0000 UTC m=+1294.818524741" lastFinishedPulling="2026-03-19 19:18:29.94286397 +0000 UTC m=+1334.696932283" observedRunningTime="2026-03-19 19:18:31.228305498 +0000 UTC m=+1335.982373811" watchObservedRunningTime="2026-03-19 19:18:31.238265649 +0000 UTC m=+1335.992333962" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.258769 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259476 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259543 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259711 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259735 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259782 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.259811 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj97v\" (UniqueName: 
\"kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.262169 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.268493 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.268824 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.268876 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.270155 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs\") pod \"placement-849c97b598-8kp5r\" (UID: 
\"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.270317 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-xfct2" podStartSLOduration=2.636813152 podStartE2EDuration="43.270303363s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="2026-03-19 19:17:49.82205806 +0000 UTC m=+1294.576126363" lastFinishedPulling="2026-03-19 19:18:30.455548261 +0000 UTC m=+1335.209616574" observedRunningTime="2026-03-19 19:18:31.247433401 +0000 UTC m=+1336.001501714" watchObservedRunningTime="2026-03-19 19:18:31.270303363 +0000 UTC m=+1336.024371676" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.289821 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-p7jzd" podStartSLOduration=3.220425987 podStartE2EDuration="43.289800714s" podCreationTimestamp="2026-03-19 19:17:48 +0000 UTC" firstStartedPulling="2026-03-19 19:17:49.8733688 +0000 UTC m=+1294.627437103" lastFinishedPulling="2026-03-19 19:18:29.942743517 +0000 UTC m=+1334.696811830" observedRunningTime="2026-03-19 19:18:31.268393967 +0000 UTC m=+1336.022462280" watchObservedRunningTime="2026-03-19 19:18:31.289800714 +0000 UTC m=+1336.043869027" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.292899 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data\") pod \"placement-849c97b598-8kp5r\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.300833 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj97v\" (UniqueName: \"kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v\") pod \"placement-849c97b598-8kp5r\" (UID: 
\"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.343283 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-578cc9f57d-6xb4n"] Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.344745 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.351748 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.351945 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.352075 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.352183 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.353071 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.354836 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sxv9p" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.360277 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-578cc9f57d-6xb4n"] Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366020 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-public-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 
19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366100 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-config-data\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366120 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-internal-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366274 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-combined-ca-bundle\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366329 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-fernet-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366344 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnspv\" (UniqueName: \"kubernetes.io/projected/9470cb99-e153-461b-a17e-83897c3de6f1-kube-api-access-xnspv\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " 
pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366364 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-scripts\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.366392 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-credential-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.411350 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.467988 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-fernet-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468031 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnspv\" (UniqueName: \"kubernetes.io/projected/9470cb99-e153-461b-a17e-83897c3de6f1-kube-api-access-xnspv\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468058 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-scripts\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468085 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-credential-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468146 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-public-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468186 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-internal-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468206 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-config-data\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.468308 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-combined-ca-bundle\") pod 
\"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.471991 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-combined-ca-bundle\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.481928 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-internal-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.488331 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-config-data\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.490204 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-scripts\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.490684 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-credential-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc 
kubenswrapper[4826]: I0319 19:18:31.495261 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-fernet-keys\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.495425 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470cb99-e153-461b-a17e-83897c3de6f1-public-tls-certs\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.496248 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnspv\" (UniqueName: \"kubernetes.io/projected/9470cb99-e153-461b-a17e-83897c3de6f1-kube-api-access-xnspv\") pod \"keystone-578cc9f57d-6xb4n\" (UID: \"9470cb99-e153-461b-a17e-83897c3de6f1\") " pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.703544 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:31 crc kubenswrapper[4826]: I0319 19:18:31.949968 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.025436 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ff4e07-a700-4be9-a448-7ccc4a75f8ae" path="/var/lib/kubelet/pods/a5ff4e07-a700-4be9-a448-7ccc4a75f8ae/volumes" Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.180842 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-578cc9f57d-6xb4n"] Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.250321 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-578cc9f57d-6xb4n" event={"ID":"9470cb99-e153-461b-a17e-83897c3de6f1","Type":"ContainerStarted","Data":"0968d6ee53f8eda0f8b4569d8ab06150670ab9589179ac1092f74f8423e2e724"} Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.252364 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerStarted","Data":"f89ad55452439cf7b6024e24d15ded80d2d695458438f9abf5bf2d12b1fe1824"} Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.354919 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.449389 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.974935 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.976473 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 
19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.976546 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:18:32 crc kubenswrapper[4826]: I0319 19:18:32.976617 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.048671 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.091452 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.121787 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.188620 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.193829 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="dnsmasq-dns" containerID="cri-o://c16a2cb6caeceb61b6c714fde783f85b19d924734e0dc6b9a1d10bab431c90f6" gracePeriod=10 Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.277095 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerStarted","Data":"4f149913d18416843831582f7a3640dcecba1ef3dac591f776f00032fd7813e5"} Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.277144 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" 
event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerStarted","Data":"55970305422c2e6cbf86f40c0b9e98cdee5dc4de5452a9f833a2e65e413a024d"} Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.277186 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.277209 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.284330 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-578cc9f57d-6xb4n" event={"ID":"9470cb99-e153-461b-a17e-83897c3de6f1","Type":"ContainerStarted","Data":"67491b26e31815755695e64a331faa470172e96933514f62eaad0b2c28617756"} Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.325560 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-849c97b598-8kp5r" podStartSLOduration=2.325526816 podStartE2EDuration="2.325526816s" podCreationTimestamp="2026-03-19 19:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:33.308473875 +0000 UTC m=+1338.062542178" watchObservedRunningTime="2026-03-19 19:18:33.325526816 +0000 UTC m=+1338.079595129" Mar 19 19:18:33 crc kubenswrapper[4826]: I0319 19:18:33.378396 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-578cc9f57d-6xb4n" podStartSLOduration=2.378376564 podStartE2EDuration="2.378376564s" podCreationTimestamp="2026-03-19 19:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:33.334845522 +0000 UTC m=+1338.088913855" watchObservedRunningTime="2026-03-19 19:18:33.378376564 +0000 UTC m=+1338.132444877" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 
19:18:34.085759 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58fdd88484-zg97f"] Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.087878 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.106683 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58fdd88484-zg97f"] Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.126442 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.189:5353: connect: connection refused" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162550 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-internal-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162614 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-config-data\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162690 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-combined-ca-bundle\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 
19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162712 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-scripts\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162739 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-public-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.162970 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xbr\" (UniqueName: \"kubernetes.io/projected/aa0311ae-1bf2-42fe-a9f8-557f722b8065-kube-api-access-l6xbr\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.163084 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0311ae-1bf2-42fe-a9f8-557f722b8065-logs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266106 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xbr\" (UniqueName: \"kubernetes.io/projected/aa0311ae-1bf2-42fe-a9f8-557f722b8065-kube-api-access-l6xbr\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 
19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266159 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0311ae-1bf2-42fe-a9f8-557f722b8065-logs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266189 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-internal-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266220 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-config-data\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266317 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-combined-ca-bundle\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266344 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-scripts\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266375 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-public-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.266807 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0311ae-1bf2-42fe-a9f8-557f722b8065-logs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.273100 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-scripts\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.274228 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-public-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.274447 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-config-data\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.277068 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-internal-tls-certs\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.286127 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0311ae-1bf2-42fe-a9f8-557f722b8065-combined-ca-bundle\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.287033 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xbr\" (UniqueName: \"kubernetes.io/projected/aa0311ae-1bf2-42fe-a9f8-557f722b8065-kube-api-access-l6xbr\") pod \"placement-58fdd88484-zg97f\" (UID: \"aa0311ae-1bf2-42fe-a9f8-557f722b8065\") " pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.303694 4826 generic.go:334] "Generic (PLEG): container finished" podID="6b4865a7-127e-4108-bc3a-3aac30103761" containerID="c16a2cb6caeceb61b6c714fde783f85b19d924734e0dc6b9a1d10bab431c90f6" exitCode=0 Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.303852 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" event={"ID":"6b4865a7-127e-4108-bc3a-3aac30103761","Type":"ContainerDied","Data":"c16a2cb6caeceb61b6c714fde783f85b19d924734e0dc6b9a1d10bab431c90f6"} Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.305807 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.416232 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.896261 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983313 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983355 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983420 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjgj\" (UniqueName: \"kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983468 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983566 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: 
\"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.983695 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc\") pod \"6b4865a7-127e-4108-bc3a-3aac30103761\" (UID: \"6b4865a7-127e-4108-bc3a-3aac30103761\") " Mar 19 19:18:34 crc kubenswrapper[4826]: I0319 19:18:34.995419 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj" (OuterVolumeSpecName: "kube-api-access-cbjgj") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "kube-api-access-cbjgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.042913 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.048593 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.058300 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.075349 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.086283 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.086313 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.088419 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.088438 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-ovsdbserver-sb\") on node \"crc\" DevicePath 
\"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.088449 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjgj\" (UniqueName: \"kubernetes.io/projected/6b4865a7-127e-4108-bc3a-3aac30103761-kube-api-access-cbjgj\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.100910 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config" (OuterVolumeSpecName: "config") pod "6b4865a7-127e-4108-bc3a-3aac30103761" (UID: "6b4865a7-127e-4108-bc3a-3aac30103761"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.190845 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4865a7-127e-4108-bc3a-3aac30103761-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.277993 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58fdd88484-zg97f"] Mar 19 19:18:35 crc kubenswrapper[4826]: W0319 19:18:35.294524 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0311ae_1bf2_42fe_a9f8_557f722b8065.slice/crio-22e74a030f42a2fb6f61aea9891bf72975780373a216015d5a6eb919e2d990d1 WatchSource:0}: Error finding container 22e74a030f42a2fb6f61aea9891bf72975780373a216015d5a6eb919e2d990d1: Status 404 returned error can't find the container with id 22e74a030f42a2fb6f61aea9891bf72975780373a216015d5a6eb919e2d990d1 Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.319034 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" event={"ID":"6b4865a7-127e-4108-bc3a-3aac30103761","Type":"ContainerDied","Data":"65eafa6a5d4ef1b8f2c1a3dd6fd553c988a604e3c12c1a76dede8218c7c98568"} Mar 19 19:18:35 crc 
kubenswrapper[4826]: I0319 19:18:35.319296 4826 scope.go:117] "RemoveContainer" containerID="c16a2cb6caeceb61b6c714fde783f85b19d924734e0dc6b9a1d10bab431c90f6" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.319568 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-gvpkd" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.326999 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58fdd88484-zg97f" event={"ID":"aa0311ae-1bf2-42fe-a9f8-557f722b8065","Type":"ContainerStarted","Data":"22e74a030f42a2fb6f61aea9891bf72975780373a216015d5a6eb919e2d990d1"} Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.456400 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.463062 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-gvpkd"] Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.474628 4826 scope.go:117] "RemoveContainer" containerID="3e3f7b8074e3de05d6a41fed9971959d1a337b230ae074fb29f0fadfd278f287" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.880861 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:18:35 crc kubenswrapper[4826]: I0319 19:18:35.880954 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.081350 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" path="/var/lib/kubelet/pods/6b4865a7-127e-4108-bc3a-3aac30103761/volumes" Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.192081 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.352750 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58fdd88484-zg97f" event={"ID":"aa0311ae-1bf2-42fe-a9f8-557f722b8065","Type":"ContainerStarted","Data":"c56a64aef4bb2c738fd545c74d61030f47f970804ebaf642d9b134b972922e30"} Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.353286 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58fdd88484-zg97f" event={"ID":"aa0311ae-1bf2-42fe-a9f8-557f722b8065","Type":"ContainerStarted","Data":"491713f77575e5bbbff615875863f301c22f65f3439770837159f57dfecfb080"} Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.353327 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.353376 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:18:36 crc kubenswrapper[4826]: I0319 19:18:36.402451 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58fdd88484-zg97f" podStartSLOduration=2.402432743 podStartE2EDuration="2.402432743s" podCreationTimestamp="2026-03-19 19:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:36.39198198 +0000 UTC m=+1341.146050293" watchObservedRunningTime="2026-03-19 19:18:36.402432743 +0000 UTC m=+1341.156501056" Mar 19 19:18:37 crc kubenswrapper[4826]: I0319 19:18:37.364565 4826 generic.go:334] "Generic (PLEG): container finished" podID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" containerID="1ef54ebc6936b221c2cf1fbad176ea2f4a00c6202a38a5c29df95790ff70da95" exitCode=0 Mar 19 19:18:37 crc kubenswrapper[4826]: I0319 19:18:37.364625 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xkjdp" 
event={"ID":"30c5e21e-66a0-47a9-b03d-55fbfe372d1b","Type":"ContainerDied","Data":"1ef54ebc6936b221c2cf1fbad176ea2f4a00c6202a38a5c29df95790ff70da95"} Mar 19 19:18:39 crc kubenswrapper[4826]: I0319 19:18:39.389868 4826 generic.go:334] "Generic (PLEG): container finished" podID="d92c353f-6fae-4be8-8580-4066bb56e856" containerID="d4baa6814b918b1dfbc41b1d24266caac18661ff58725f3f3044a81bdf0da630" exitCode=0 Mar 19 19:18:39 crc kubenswrapper[4826]: I0319 19:18:39.389999 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xfct2" event={"ID":"d92c353f-6fae-4be8-8580-4066bb56e856","Type":"ContainerDied","Data":"d4baa6814b918b1dfbc41b1d24266caac18661ff58725f3f3044a81bdf0da630"} Mar 19 19:18:40 crc kubenswrapper[4826]: I0319 19:18:40.405118 4826 generic.go:334] "Generic (PLEG): container finished" podID="5c80aa39-c840-4267-9677-bb82f387073d" containerID="7c58f9f262ad39c6602e773a8352f8bab23f243e0f8ad7878bd22836f1982c79" exitCode=0 Mar 19 19:18:40 crc kubenswrapper[4826]: I0319 19:18:40.405228 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p7jzd" event={"ID":"5c80aa39-c840-4267-9677-bb82f387073d","Type":"ContainerDied","Data":"7c58f9f262ad39c6602e773a8352f8bab23f243e0f8ad7878bd22836f1982c79"} Mar 19 19:18:40 crc kubenswrapper[4826]: I0319 19:18:40.883080 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.043179 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle\") pod \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.043359 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrl4l\" (UniqueName: \"kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l\") pod \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.043406 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data\") pod \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\" (UID: \"30c5e21e-66a0-47a9-b03d-55fbfe372d1b\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.048981 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "30c5e21e-66a0-47a9-b03d-55fbfe372d1b" (UID: "30c5e21e-66a0-47a9-b03d-55fbfe372d1b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.049952 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l" (OuterVolumeSpecName: "kube-api-access-nrl4l") pod "30c5e21e-66a0-47a9-b03d-55fbfe372d1b" (UID: "30c5e21e-66a0-47a9-b03d-55fbfe372d1b"). 
InnerVolumeSpecName "kube-api-access-nrl4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.095331 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c5e21e-66a0-47a9-b03d-55fbfe372d1b" (UID: "30c5e21e-66a0-47a9-b03d-55fbfe372d1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.146422 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.146451 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrl4l\" (UniqueName: \"kubernetes.io/projected/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-kube-api-access-nrl4l\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.146463 4826 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30c5e21e-66a0-47a9-b03d-55fbfe372d1b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.302329 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xfct2" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.426797 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xkjdp" event={"ID":"30c5e21e-66a0-47a9-b03d-55fbfe372d1b","Type":"ContainerDied","Data":"258275de3c3bcd4351ba5dbb36b1240c6ec9f22f8d761008ce9de87a1de528d6"} Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.426901 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258275de3c3bcd4351ba5dbb36b1240c6ec9f22f8d761008ce9de87a1de528d6" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.427002 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xkjdp" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.432778 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xfct2" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.436735 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xfct2" event={"ID":"d92c353f-6fae-4be8-8580-4066bb56e856","Type":"ContainerDied","Data":"9c6ef73f012d6c9d864f42cd7b5eb0f8b613370217ae491511a18979a4c7db1a"} Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.436766 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c6ef73f012d6c9d864f42cd7b5eb0f8b613370217ae491511a18979a4c7db1a" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.452353 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data\") pod \"d92c353f-6fae-4be8-8580-4066bb56e856\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.452866 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle\") pod \"d92c353f-6fae-4be8-8580-4066bb56e856\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.453109 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8kf4\" (UniqueName: \"kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4\") pod \"d92c353f-6fae-4be8-8580-4066bb56e856\" (UID: \"d92c353f-6fae-4be8-8580-4066bb56e856\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.457635 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4" (OuterVolumeSpecName: "kube-api-access-j8kf4") pod "d92c353f-6fae-4be8-8580-4066bb56e856" (UID: "d92c353f-6fae-4be8-8580-4066bb56e856"). InnerVolumeSpecName "kube-api-access-j8kf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.489101 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d92c353f-6fae-4be8-8580-4066bb56e856" (UID: "d92c353f-6fae-4be8-8580-4066bb56e856"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: E0319 19:18:41.539779 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.555551 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.555592 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8kf4\" (UniqueName: \"kubernetes.io/projected/d92c353f-6fae-4be8-8580-4066bb56e856-kube-api-access-j8kf4\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.584349 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data" (OuterVolumeSpecName: "config-data") pod "d92c353f-6fae-4be8-8580-4066bb56e856" (UID: "d92c353f-6fae-4be8-8580-4066bb56e856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.658035 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92c353f-6fae-4be8-8580-4066bb56e856-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.767914 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.860545 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.860841 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.860874 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4t44\" (UniqueName: \"kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.860957 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.861125 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.861162 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data\") pod \"5c80aa39-c840-4267-9677-bb82f387073d\" (UID: \"5c80aa39-c840-4267-9677-bb82f387073d\") " Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.862226 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.865957 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts" (OuterVolumeSpecName: "scripts") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.866687 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44" (OuterVolumeSpecName: "kube-api-access-f4t44") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "kube-api-access-f4t44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.866984 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.908523 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.946814 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data" (OuterVolumeSpecName: "config-data") pod "5c80aa39-c840-4267-9677-bb82f387073d" (UID: "5c80aa39-c840-4267-9677-bb82f387073d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963316 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963353 4826 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963369 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963382 4826 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c80aa39-c840-4267-9677-bb82f387073d-etc-machine-id\") on node \"crc\" DevicePath 
\"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963394 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4t44\" (UniqueName: \"kubernetes.io/projected/5c80aa39-c840-4267-9677-bb82f387073d-kube-api-access-f4t44\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:41 crc kubenswrapper[4826]: I0319 19:18:41.963408 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c80aa39-c840-4267-9677-bb82f387073d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.174412 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-67df6b8d6c-9kl5x"] Mar 19 19:18:42 crc kubenswrapper[4826]: E0319 19:18:42.175006 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="init" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175096 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="init" Mar 19 19:18:42 crc kubenswrapper[4826]: E0319 19:18:42.175165 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c80aa39-c840-4267-9677-bb82f387073d" containerName="cinder-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175225 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c80aa39-c840-4267-9677-bb82f387073d" containerName="cinder-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: E0319 19:18:42.175298 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="dnsmasq-dns" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175349 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="dnsmasq-dns" Mar 19 19:18:42 crc kubenswrapper[4826]: E0319 19:18:42.175406 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" containerName="barbican-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175460 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" containerName="barbican-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: E0319 19:18:42.175532 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c353f-6fae-4be8-8580-4066bb56e856" containerName="heat-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175589 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c353f-6fae-4be8-8580-4066bb56e856" containerName="heat-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.175912 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c353f-6fae-4be8-8580-4066bb56e856" containerName="heat-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.176056 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4865a7-127e-4108-bc3a-3aac30103761" containerName="dnsmasq-dns" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.176135 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" containerName="barbican-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.176198 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c80aa39-c840-4267-9677-bb82f387073d" containerName="cinder-db-sync" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.177333 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.190323 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.190939 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bsd2c" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.191610 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.195824 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f89998948-7dsjr"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.207009 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.209979 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.224009 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67df6b8d6c-9kl5x"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.248930 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f89998948-7dsjr"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.274418 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data-custom\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.274583 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.274860 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-combined-ca-bundle\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.274951 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf6154b-c9df-415a-82dd-907847aaf7d4-logs\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.275065 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9kt\" (UniqueName: \"kubernetes.io/projected/6cf6154b-c9df-415a-82dd-907847aaf7d4-kube-api-access-br9kt\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.314052 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.316262 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.324317 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379372 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnhns\" (UniqueName: \"kubernetes.io/projected/f794027a-e079-4fdc-94b5-aa448e034df7-kube-api-access-qnhns\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379430 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379482 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data-custom\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379535 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-combined-ca-bundle\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379556 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf6154b-c9df-415a-82dd-907847aaf7d4-logs\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379579 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-combined-ca-bundle\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379597 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br9kt\" (UniqueName: \"kubernetes.io/projected/6cf6154b-c9df-415a-82dd-907847aaf7d4-kube-api-access-br9kt\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379644 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379707 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data-custom\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 
19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.379728 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f794027a-e079-4fdc-94b5-aa448e034df7-logs\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.386392 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf6154b-c9df-415a-82dd-907847aaf7d4-logs\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.389031 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data-custom\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.389568 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-config-data\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.389577 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf6154b-c9df-415a-82dd-907847aaf7d4-combined-ca-bundle\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 
19:18:42.405947 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br9kt\" (UniqueName: \"kubernetes.io/projected/6cf6154b-c9df-415a-82dd-907847aaf7d4-kube-api-access-br9kt\") pod \"barbican-worker-67df6b8d6c-9kl5x\" (UID: \"6cf6154b-c9df-415a-82dd-907847aaf7d4\") " pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.423768 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.425734 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.432101 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.462396 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.490472 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-combined-ca-bundle\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.490638 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.490799 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.490901 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491217 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f794027a-e079-4fdc-94b5-aa448e034df7-logs\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491281 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f794027a-e079-4fdc-94b5-aa448e034df7-logs\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491525 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnhns\" (UniqueName: \"kubernetes.io/projected/f794027a-e079-4fdc-94b5-aa448e034df7-kube-api-access-qnhns\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: 
I0319 19:18:42.491605 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491709 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data-custom\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491751 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491772 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nhp\" (UniqueName: \"kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.491963 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" 
Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.494558 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p7jzd" event={"ID":"5c80aa39-c840-4267-9677-bb82f387073d","Type":"ContainerDied","Data":"abfb107d188854484028ff47fea895572a119cc0a6c969e19ca4434c1fb33fef"} Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.494595 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abfb107d188854484028ff47fea895572a119cc0a6c969e19ca4434c1fb33fef" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.494759 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-p7jzd" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.495600 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data-custom\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.496017 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-combined-ca-bundle\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.512265 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f794027a-e079-4fdc-94b5-aa448e034df7-config-data\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 
19:18:42.522399 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerStarted","Data":"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34"} Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.522489 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnhns\" (UniqueName: \"kubernetes.io/projected/f794027a-e079-4fdc-94b5-aa448e034df7-kube-api-access-qnhns\") pod \"barbican-keystone-listener-7f89998948-7dsjr\" (UID: \"f794027a-e079-4fdc-94b5-aa448e034df7\") " pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.522573 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="ceilometer-notification-agent" containerID="cri-o://b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73" gracePeriod=30 Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.522613 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.522665 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="proxy-httpd" containerID="cri-o://5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34" gracePeriod=30 Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.522698 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="sg-core" containerID="cri-o://71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59" gracePeriod=30 Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.541875 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.555699 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595239 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595335 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595375 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595404 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595441 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595456 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k589r\" (UniqueName: \"kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595488 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595505 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nhp\" (UniqueName: \"kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595535 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.595571 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.596401 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.596925 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.597434 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.597917 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.598039 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.622865 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nhp\" (UniqueName: \"kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp\") pod \"dnsmasq-dns-848cf88cfc-tv45r\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.640255 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.672817 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.675314 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.683258 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.683546 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-266gk" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.683741 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.683834 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.698920 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.702701 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k589r\" (UniqueName: \"kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.702836 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.703125 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.703199 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.703613 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.706500 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.719496 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.741871 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.742565 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k589r\" (UniqueName: \"kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r\") pod \"barbican-api-764c5457fd-mw862\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.749966 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.784720 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.785399 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.803283 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.805321 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807362 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807484 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807626 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807756 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr49t\" (UniqueName: \"kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807831 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " 
pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.807926 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.816954 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916009 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916304 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916355 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916389 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr49t\" (UniqueName: 
\"kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916413 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916435 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916459 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916483 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916532 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnqbp\" (UniqueName: 
\"kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916573 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916600 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916618 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.916754 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.946981 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom\") pod \"cinder-scheduler-0\" 
(UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.947409 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.948282 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.954074 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.956215 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.958898 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.963629 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.965416 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr49t\" (UniqueName: \"kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t\") pod \"cinder-scheduler-0\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:42 crc kubenswrapper[4826]: I0319 19:18:42.972550 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034272 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnqbp\" (UniqueName: \"kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034358 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034496 4826 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034534 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034588 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.034739 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.036483 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.037102 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.037613 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.039018 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.039356 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.043205 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.055381 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnqbp\" (UniqueName: \"kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp\") pod \"dnsmasq-dns-6578955fd5-z2dgh\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.144269 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.144530 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.144572 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.144616 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.144630 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.145955 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.146008 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzlmg\" (UniqueName: \"kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.241419 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67df6b8d6c-9kl5x"] Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248112 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248173 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzlmg\" (UniqueName: \"kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc 
kubenswrapper[4826]: I0319 19:18:43.248267 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248292 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248340 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248387 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.248408 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.249212 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.251576 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.253683 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.254394 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.255374 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.261831 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.267689 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzlmg\" (UniqueName: 
\"kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg\") pod \"cinder-api-0\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.353841 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.354776 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.510389 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.526927 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f89998948-7dsjr"] Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.539725 4826 generic.go:334] "Generic (PLEG): container finished" podID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerID="5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34" exitCode=0 Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.539758 4826 generic.go:334] "Generic (PLEG): container finished" podID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerID="71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59" exitCode=2 Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.539796 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerDied","Data":"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34"} Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.539823 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerDied","Data":"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59"} Mar 19 
19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.541171 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" event={"ID":"6cf6154b-c9df-415a-82dd-907847aaf7d4","Type":"ContainerStarted","Data":"82347d20a97583f3f3964cd1b12f837abe4ad6f8b9eee67ed43a7342962bbafc"} Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.685975 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:43 crc kubenswrapper[4826]: I0319 19:18:43.713481 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:44 crc kubenswrapper[4826]: W0319 19:18:44.753735 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3208df19_70b8_42be_b3b3_66b38532ab47.slice/crio-aa6ab795d7abf4f6ba0e47a80a164e5a6c969486caa62c16ef4e7abb986c13e7 WatchSource:0}: Error finding container aa6ab795d7abf4f6ba0e47a80a164e5a6c969486caa62c16ef4e7abb986c13e7: Status 404 returned error can't find the container with id aa6ab795d7abf4f6ba0e47a80a164e5a6c969486caa62c16ef4e7abb986c13e7 Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.293765 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.391175 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.527459 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.542336 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.542702 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wckx\" (UniqueName: \"kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.542824 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.542977 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.543113 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 
19:18:45.543368 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.543527 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle\") pod \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\" (UID: \"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d\") " Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.544411 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.544913 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.548026 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts" (OuterVolumeSpecName: "scripts") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.548226 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx" (OuterVolumeSpecName: "kube-api-access-7wckx") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "kube-api-access-7wckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.565715 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" event={"ID":"f794027a-e079-4fdc-94b5-aa448e034df7","Type":"ContainerStarted","Data":"f41253a03e2620bd24ce2b2d7dfd35a4b3319328730d37d12ea7bbb059c1f221"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.566916 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" event={"ID":"3fd164df-263a-4817-a6d0-bedc59421e75","Type":"ContainerStarted","Data":"e0d31fea3d1ad082dee49de6b2b4f17a02d10c94a8dd776e6293dbd61323e4d0"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.568859 4826 generic.go:334] "Generic (PLEG): container finished" podID="8064c2d9-20b8-4d57-b79b-9579455d8370" containerID="cd696c80c6fa7ab0ed4c4bfeb82d96685e3f8e48fcf5f1239121c63f72a004f9" exitCode=0 Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.568910 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" event={"ID":"8064c2d9-20b8-4d57-b79b-9579455d8370","Type":"ContainerDied","Data":"cd696c80c6fa7ab0ed4c4bfeb82d96685e3f8e48fcf5f1239121c63f72a004f9"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.568927 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" 
event={"ID":"8064c2d9-20b8-4d57-b79b-9579455d8370","Type":"ContainerStarted","Data":"d56b75d715c8353c995c5928fbed76eb2ef005dfa39ff62c88846be4539d9bef"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.574443 4826 generic.go:334] "Generic (PLEG): container finished" podID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerID="b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73" exitCode=0 Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.574489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerDied","Data":"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.574506 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc865bcc-551d-4ae7-a0f3-af128e1d1e2d","Type":"ContainerDied","Data":"198110759ac95ec5060deb3b0f6f55b21c28cdd021d6f4446c1c6feb3bb23295"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.574516 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.574522 4826 scope.go:117] "RemoveContainer" containerID="5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.576026 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerStarted","Data":"27a4ff33c3ca2893a28fb0f91f9778d783c00707eed34389c1797c384d27ede1"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.584590 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerStarted","Data":"1b0d81911a1aa40e52e5dcacfa559a1063dd6457625255436c1895d525f2ae7d"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.584623 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerStarted","Data":"aa6ab795d7abf4f6ba0e47a80a164e5a6c969486caa62c16ef4e7abb986c13e7"} Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.607468 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.648812 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wckx\" (UniqueName: \"kubernetes.io/projected/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-kube-api-access-7wckx\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.648840 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.648849 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.648857 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.648867 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.652380 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.674135 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data" (OuterVolumeSpecName: "config-data") pod "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" (UID: "dc865bcc-551d-4ae7-a0f3-af128e1d1e2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.751842 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.751871 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.913921 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:18:45 crc kubenswrapper[4826]: I0319 19:18:45.956982 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:45.999884 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.003257 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:46 crc kubenswrapper[4826]: E0319 19:18:46.003909 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="sg-core" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.003923 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="sg-core" Mar 19 19:18:46 
crc kubenswrapper[4826]: E0319 19:18:46.003957 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="proxy-httpd" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.003965 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="proxy-httpd" Mar 19 19:18:46 crc kubenswrapper[4826]: E0319 19:18:46.003986 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="ceilometer-notification-agent" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.003994 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="ceilometer-notification-agent" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.004468 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="proxy-httpd" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.004485 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="sg-core" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.004497 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" containerName="ceilometer-notification-agent" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.008456 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.011888 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.012107 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.049809 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.164758 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.164821 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.164854 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.164878 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " 
pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.164915 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kjt\" (UniqueName: \"kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.165097 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.165130 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266628 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266726 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266766 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266799 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266842 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kjt\" (UniqueName: \"kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266908 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.266936 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.267216 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 
crc kubenswrapper[4826]: I0319 19:18:46.268722 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.272958 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.274757 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.279446 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.280505 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data\") pod \"ceilometer-0\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.288797 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kjt\" (UniqueName: \"kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt\") pod \"ceilometer-0\" (UID: 
\"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.371749 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.377786 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471188 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471375 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471477 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4nhp\" (UniqueName: \"kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471624 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.471669 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config\") pod \"8064c2d9-20b8-4d57-b79b-9579455d8370\" (UID: \"8064c2d9-20b8-4d57-b79b-9579455d8370\") " Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.474800 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp" (OuterVolumeSpecName: "kube-api-access-d4nhp") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "kube-api-access-d4nhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.501607 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.502692 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.505982 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.508908 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config" (OuterVolumeSpecName: "config") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.517239 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8064c2d9-20b8-4d57-b79b-9579455d8370" (UID: "8064c2d9-20b8-4d57-b79b-9579455d8370"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574900 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574932 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574942 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574951 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4nhp\" (UniqueName: \"kubernetes.io/projected/8064c2d9-20b8-4d57-b79b-9579455d8370-kube-api-access-d4nhp\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574962 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.574969 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8064c2d9-20b8-4d57-b79b-9579455d8370-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.604414 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" event={"ID":"8064c2d9-20b8-4d57-b79b-9579455d8370","Type":"ContainerDied","Data":"d56b75d715c8353c995c5928fbed76eb2ef005dfa39ff62c88846be4539d9bef"} Mar 19 19:18:46 crc 
kubenswrapper[4826]: I0319 19:18:46.604513 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-tv45r" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.628345 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerStarted","Data":"b743f3de347bd7aca25a9358c681da47e5ce86ff9e000ff79fef4ad280561d06"} Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.628924 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.628981 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.631458 4826 scope.go:117] "RemoveContainer" containerID="71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.635717 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerStarted","Data":"6b28b1457b009b290034ce1999b13b277c53b38160bb137b2d549422a3368c92"} Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.637154 4826 generic.go:334] "Generic (PLEG): container finished" podID="3fd164df-263a-4817-a6d0-bedc59421e75" containerID="29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5" exitCode=0 Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.637200 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" event={"ID":"3fd164df-263a-4817-a6d0-bedc59421e75","Type":"ContainerDied","Data":"29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5"} Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.654331 4826 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-api-764c5457fd-mw862" podStartSLOduration=4.654311463 podStartE2EDuration="4.654311463s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:46.64216388 +0000 UTC m=+1351.396232213" watchObservedRunningTime="2026-03-19 19:18:46.654311463 +0000 UTC m=+1351.408379766" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.708877 4826 scope.go:117] "RemoveContainer" containerID="b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.736808 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.753043 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-tv45r"] Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.760922 4826 scope.go:117] "RemoveContainer" containerID="5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34" Mar 19 19:18:46 crc kubenswrapper[4826]: E0319 19:18:46.763409 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34\": container with ID starting with 5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34 not found: ID does not exist" containerID="5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.763489 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34"} err="failed to get container status \"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34\": rpc error: code = NotFound desc = could not 
find container \"5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34\": container with ID starting with 5de9e81e5ad92596313fd9a21748d7252aafee81793961220ed2af636e525e34 not found: ID does not exist" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.763545 4826 scope.go:117] "RemoveContainer" containerID="71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59" Mar 19 19:18:46 crc kubenswrapper[4826]: E0319 19:18:46.763997 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59\": container with ID starting with 71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59 not found: ID does not exist" containerID="71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.764034 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59"} err="failed to get container status \"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59\": rpc error: code = NotFound desc = could not find container \"71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59\": container with ID starting with 71098aa46f4c29e2324a7a3122515a2d8bcfde0c286c19147dff3fc814d70f59 not found: ID does not exist" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.764064 4826 scope.go:117] "RemoveContainer" containerID="b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73" Mar 19 19:18:46 crc kubenswrapper[4826]: E0319 19:18:46.765052 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73\": container with ID starting with b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73 not found: ID 
does not exist" containerID="b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.765090 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73"} err="failed to get container status \"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73\": rpc error: code = NotFound desc = could not find container \"b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73\": container with ID starting with b817efec6ae173dd8722c23c875f2d78f243d318a14b4b568442518dde4e2a73 not found: ID does not exist" Mar 19 19:18:46 crc kubenswrapper[4826]: I0319 19:18:46.765110 4826 scope.go:117] "RemoveContainer" containerID="cd696c80c6fa7ab0ed4c4bfeb82d96685e3f8e48fcf5f1239121c63f72a004f9" Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.289265 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:18:47 crc kubenswrapper[4826]: W0319 19:18:47.291125 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafbec0cf_36b2_4154_b0e2_6d886a4a0344.slice/crio-c2824a06dd03aab824cfbf1aad6bd19fc90c47d988df37ce1585f947b12b7bcb WatchSource:0}: Error finding container c2824a06dd03aab824cfbf1aad6bd19fc90c47d988df37ce1585f947b12b7bcb: Status 404 returned error can't find the container with id c2824a06dd03aab824cfbf1aad6bd19fc90c47d988df37ce1585f947b12b7bcb Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.652345 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" event={"ID":"f794027a-e079-4fdc-94b5-aa448e034df7","Type":"ContainerStarted","Data":"6b93ff0472f4b9db4c2d5a00108f339d5e60784e042249391cb5adb91bb727c7"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.652948 4826 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" event={"ID":"f794027a-e079-4fdc-94b5-aa448e034df7","Type":"ContainerStarted","Data":"b75e028517a2a201a4fe458579a340bfa3daf59b166745da55c3b674668a3438"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.654739 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerStarted","Data":"e19d9e6d29439f3ce97323998035daa104edaa30e430992094a585e1dc50a1ce"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.660744 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" event={"ID":"3fd164df-263a-4817-a6d0-bedc59421e75","Type":"ContainerStarted","Data":"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.662183 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.672824 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerStarted","Data":"c2824a06dd03aab824cfbf1aad6bd19fc90c47d988df37ce1585f947b12b7bcb"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.673578 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f89998948-7dsjr" podStartSLOduration=3.73317639 podStartE2EDuration="5.673555858s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="2026-03-19 19:18:44.799526385 +0000 UTC m=+1349.553594728" lastFinishedPulling="2026-03-19 19:18:46.739905873 +0000 UTC m=+1351.493974196" observedRunningTime="2026-03-19 19:18:47.670975216 +0000 UTC m=+1352.425043549" watchObservedRunningTime="2026-03-19 19:18:47.673555858 +0000 UTC m=+1352.427624171" Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 
19:18:47.678881 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" event={"ID":"6cf6154b-c9df-415a-82dd-907847aaf7d4","Type":"ContainerStarted","Data":"2139e9798ef286427ae278723e9b4b09917099e682f62ab188e5f437d81b1b2c"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.679060 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" event={"ID":"6cf6154b-c9df-415a-82dd-907847aaf7d4","Type":"ContainerStarted","Data":"ce40232cc5d6cce78ec82c6cc428b49f9dc9d852eb5be4abe531ab0cf934ea33"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.688675 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerStarted","Data":"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212"} Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.705605 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" podStartSLOduration=5.705590142 podStartE2EDuration="5.705590142s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:47.694358951 +0000 UTC m=+1352.448427264" watchObservedRunningTime="2026-03-19 19:18:47.705590142 +0000 UTC m=+1352.459658455" Mar 19 19:18:47 crc kubenswrapper[4826]: I0319 19:18:47.719977 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-67df6b8d6c-9kl5x" podStartSLOduration=2.23165679 podStartE2EDuration="5.71996056s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="2026-03-19 19:18:43.252143736 +0000 UTC m=+1348.006212049" lastFinishedPulling="2026-03-19 19:18:46.740447506 +0000 UTC m=+1351.494515819" observedRunningTime="2026-03-19 19:18:47.713910053 +0000 UTC 
m=+1352.467978356" watchObservedRunningTime="2026-03-19 19:18:47.71996056 +0000 UTC m=+1352.474028873" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.031391 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8064c2d9-20b8-4d57-b79b-9579455d8370" path="/var/lib/kubelet/pods/8064c2d9-20b8-4d57-b79b-9579455d8370/volumes" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.032644 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc865bcc-551d-4ae7-a0f3-af128e1d1e2d" path="/var/lib/kubelet/pods/dc865bcc-551d-4ae7-a0f3-af128e1d1e2d/volumes" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.172300 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.510566 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.511089 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8bf647cd5-pw2jr" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" containerID="cri-o://039b77b34fa8a5bdae7d4e93f3a043efe118970621120649b5587efc23cd8535" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.511237 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8bf647cd5-pw2jr" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-api" containerID="cri-o://5171a83f6c4abf2fc8b10d0be3c0168e50375a59d480df0886c0b1cb6d330034" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.530818 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8bf647cd5-pw2jr" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9696/\": EOF" Mar 19 19:18:48 crc kubenswrapper[4826]: 
I0319 19:18:48.535336 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6674b95fd5-8s42n"] Mar 19 19:18:48 crc kubenswrapper[4826]: E0319 19:18:48.535856 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8064c2d9-20b8-4d57-b79b-9579455d8370" containerName="init" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.535873 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8064c2d9-20b8-4d57-b79b-9579455d8370" containerName="init" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.536111 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8064c2d9-20b8-4d57-b79b-9579455d8370" containerName="init" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.537357 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.554211 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6674b95fd5-8s42n"] Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660233 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-httpd-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660292 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-public-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660321 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-combined-ca-bundle\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660339 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-internal-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660373 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660405 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-ovndb-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.660456 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdr4r\" (UniqueName: \"kubernetes.io/projected/79f755d3-73d9-4207-b77b-af2ac5f99404-kube-api-access-gdr4r\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.710759 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerStarted","Data":"72f2353bdff336712857ae56e521d63cc77595f5b5da40c4f3faf20eeaa4d307"} Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.710968 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api-log" containerID="cri-o://e19d9e6d29439f3ce97323998035daa104edaa30e430992094a585e1dc50a1ce" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.711602 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.712035 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api" containerID="cri-o://72f2353bdff336712857ae56e521d63cc77595f5b5da40c4f3faf20eeaa4d307" gracePeriod=30 Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.722084 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerStarted","Data":"a9f8657235e591f363052fd3c2fcbb52099699c71560e82f1e5a0d2b9a77da3a"} Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.731715 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerStarted","Data":"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523"} Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.746806 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.746787198 podStartE2EDuration="6.746787198s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:18:48.741724485 +0000 UTC m=+1353.495792798" watchObservedRunningTime="2026-03-19 19:18:48.746787198 +0000 UTC m=+1353.500855511" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763352 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-public-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763429 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-combined-ca-bundle\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763452 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-internal-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763500 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763549 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-ovndb-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: 
\"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.763620 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdr4r\" (UniqueName: \"kubernetes.io/projected/79f755d3-73d9-4207-b77b-af2ac5f99404-kube-api-access-gdr4r\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.764810 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-httpd-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.774625 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-public-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.777793 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-ovndb-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.780726 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-internal-tls-certs\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc 
kubenswrapper[4826]: I0319 19:18:48.781572 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-combined-ca-bundle\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.784289 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-httpd-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.788049 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdr4r\" (UniqueName: \"kubernetes.io/projected/79f755d3-73d9-4207-b77b-af2ac5f99404-kube-api-access-gdr4r\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.789990 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/79f755d3-73d9-4207-b77b-af2ac5f99404-config\") pod \"neutron-6674b95fd5-8s42n\" (UID: \"79f755d3-73d9-4207-b77b-af2ac5f99404\") " pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:48 crc kubenswrapper[4826]: I0319 19:18:48.879773 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.412141 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.514911184 podStartE2EDuration="7.412121868s" podCreationTimestamp="2026-03-19 19:18:42 +0000 UTC" firstStartedPulling="2026-03-19 19:18:44.762990482 +0000 UTC m=+1349.517058805" lastFinishedPulling="2026-03-19 19:18:46.660201176 +0000 UTC m=+1351.414269489" observedRunningTime="2026-03-19 19:18:48.778089584 +0000 UTC m=+1353.532157897" watchObservedRunningTime="2026-03-19 19:18:49.412121868 +0000 UTC m=+1354.166190181" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.415618 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-554d47978d-xzcgd"] Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.433226 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554d47978d-xzcgd"] Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.433359 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.438448 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.446304 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486022 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486085 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-internal-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486117 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlpwz\" (UniqueName: \"kubernetes.io/projected/a4ae16c3-6030-48fc-b4d2-057a45770fe1-kube-api-access-mlpwz\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486139 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-public-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: 
\"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486194 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-combined-ca-bundle\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486227 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data-custom\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.486261 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ae16c3-6030-48fc-b4d2-057a45770fe1-logs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591612 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591681 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-internal-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: 
\"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591713 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlpwz\" (UniqueName: \"kubernetes.io/projected/a4ae16c3-6030-48fc-b4d2-057a45770fe1-kube-api-access-mlpwz\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591740 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-public-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591799 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-combined-ca-bundle\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591830 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data-custom\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.591865 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ae16c3-6030-48fc-b4d2-057a45770fe1-logs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: 
\"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.592615 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ae16c3-6030-48fc-b4d2-057a45770fe1-logs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.613184 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-public-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.613664 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data-custom\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.615041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-combined-ca-bundle\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.615547 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-config-data\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 
19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.620888 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ae16c3-6030-48fc-b4d2-057a45770fe1-internal-tls-certs\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.642220 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlpwz\" (UniqueName: \"kubernetes.io/projected/a4ae16c3-6030-48fc-b4d2-057a45770fe1-kube-api-access-mlpwz\") pod \"barbican-api-554d47978d-xzcgd\" (UID: \"a4ae16c3-6030-48fc-b4d2-057a45770fe1\") " pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.749304 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6674b95fd5-8s42n"] Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.771253 4826 generic.go:334] "Generic (PLEG): container finished" podID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerID="039b77b34fa8a5bdae7d4e93f3a043efe118970621120649b5587efc23cd8535" exitCode=0 Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.771363 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerDied","Data":"039b77b34fa8a5bdae7d4e93f3a043efe118970621120649b5587efc23cd8535"} Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.776289 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.787568 4826 generic.go:334] "Generic (PLEG): container finished" podID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerID="e19d9e6d29439f3ce97323998035daa104edaa30e430992094a585e1dc50a1ce" exitCode=143 Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.787685 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerDied","Data":"e19d9e6d29439f3ce97323998035daa104edaa30e430992094a585e1dc50a1ce"} Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.805690 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerStarted","Data":"c430bf06f9cb0c55a5511c5e1e4019e78128c93bb6673178225f9c22fa273849"} Mar 19 19:18:49 crc kubenswrapper[4826]: I0319 19:18:49.973392 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.323197 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8bf647cd5-pw2jr" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9696/\": dial tcp 10.217.0.200:9696: connect: connection refused" Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.346538 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554d47978d-xzcgd"] Mar 19 19:18:50 crc kubenswrapper[4826]: W0319 19:18:50.353521 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae16c3_6030_48fc_b4d2_057a45770fe1.slice/crio-8df6bd142c3f6bcef687baa462ae91d469f9ba05aa947b46e610dc78a45fd081 WatchSource:0}: Error finding container 
8df6bd142c3f6bcef687baa462ae91d469f9ba05aa947b46e610dc78a45fd081: Status 404 returned error can't find the container with id 8df6bd142c3f6bcef687baa462ae91d469f9ba05aa947b46e610dc78a45fd081 Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.820971 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554d47978d-xzcgd" event={"ID":"a4ae16c3-6030-48fc-b4d2-057a45770fe1","Type":"ContainerStarted","Data":"f3adf70357781b1f9a56b38f96ea17ebcd47f60eb45af8c8b1557029d4d08f89"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.821210 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554d47978d-xzcgd" event={"ID":"a4ae16c3-6030-48fc-b4d2-057a45770fe1","Type":"ContainerStarted","Data":"8df6bd142c3f6bcef687baa462ae91d469f9ba05aa947b46e610dc78a45fd081"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.825402 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerStarted","Data":"50e1e5031d7449f7952423ba72251965622473086a8461ec0897811162de9104"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.828919 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674b95fd5-8s42n" event={"ID":"79f755d3-73d9-4207-b77b-af2ac5f99404","Type":"ContainerStarted","Data":"335b2eceeae57215abf4bf08e7e5541b32e8775c7a0ca5508ddeae68e6274976"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.828982 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674b95fd5-8s42n" event={"ID":"79f755d3-73d9-4207-b77b-af2ac5f99404","Type":"ContainerStarted","Data":"96bf7d00ca432a6d7d2bce4702c7407a06cef926f853b9bae99213aa69c87694"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.828994 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6674b95fd5-8s42n" 
event={"ID":"79f755d3-73d9-4207-b77b-af2ac5f99404","Type":"ContainerStarted","Data":"4c551249049636b4cdb70cc0512fef48806f8a3f761968d6659f703df0cc367c"} Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.830761 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:18:50 crc kubenswrapper[4826]: I0319 19:18:50.898111 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6674b95fd5-8s42n" podStartSLOduration=2.898090873 podStartE2EDuration="2.898090873s" podCreationTimestamp="2026-03-19 19:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:18:50.887253191 +0000 UTC m=+1355.641321524" watchObservedRunningTime="2026-03-19 19:18:50.898090873 +0000 UTC m=+1355.652159186" Mar 19 19:18:51 crc kubenswrapper[4826]: I0319 19:18:51.841803 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554d47978d-xzcgd" event={"ID":"a4ae16c3-6030-48fc-b4d2-057a45770fe1","Type":"ContainerStarted","Data":"a0b8422e65d1f4d5f2382e6949ab0bc8c750cce0ed9755aea078230172914b66"} Mar 19 19:18:51 crc kubenswrapper[4826]: I0319 19:18:51.842062 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:51 crc kubenswrapper[4826]: I0319 19:18:51.842088 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:18:51 crc kubenswrapper[4826]: I0319 19:18:51.874488 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-554d47978d-xzcgd" podStartSLOduration=2.874467912 podStartE2EDuration="2.874467912s" podCreationTimestamp="2026-03-19 19:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:18:51.860444533 +0000 UTC m=+1356.614512846" watchObservedRunningTime="2026-03-19 19:18:51.874467912 +0000 UTC m=+1356.628536225" Mar 19 19:18:52 crc kubenswrapper[4826]: I0319 19:18:52.856157 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerStarted","Data":"8b0eb43fb18b04265be0434a2dca34a28695c4a011ebc5d93034c54b31099c01"} Mar 19 19:18:52 crc kubenswrapper[4826]: I0319 19:18:52.857079 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:18:52 crc kubenswrapper[4826]: I0319 19:18:52.883509 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.5037078619999997 podStartE2EDuration="7.883487238s" podCreationTimestamp="2026-03-19 19:18:45 +0000 UTC" firstStartedPulling="2026-03-19 19:18:47.293438561 +0000 UTC m=+1352.047506874" lastFinishedPulling="2026-03-19 19:18:51.673217937 +0000 UTC m=+1356.427286250" observedRunningTime="2026-03-19 19:18:52.873313263 +0000 UTC m=+1357.627381576" watchObservedRunningTime="2026-03-19 19:18:52.883487238 +0000 UTC m=+1357.637555551" Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.044073 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.296946 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.355632 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.432117 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.432353 
4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="dnsmasq-dns" containerID="cri-o://2fad22d114982867e0d3f0a2233fb55f638b563f6899839e4b215945422a348e" gracePeriod=10 Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.873155 4826 generic.go:334] "Generic (PLEG): container finished" podID="fe47625c-be5d-44ad-be5f-b64628005833" containerID="2fad22d114982867e0d3f0a2233fb55f638b563f6899839e4b215945422a348e" exitCode=0 Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.873413 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" event={"ID":"fe47625c-be5d-44ad-be5f-b64628005833","Type":"ContainerDied","Data":"2fad22d114982867e0d3f0a2233fb55f638b563f6899839e4b215945422a348e"} Mar 19 19:18:53 crc kubenswrapper[4826]: I0319 19:18:53.932750 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.076159 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131490 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131536 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131623 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131729 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131765 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5nfw\" (UniqueName: \"kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.131784 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb\") pod \"fe47625c-be5d-44ad-be5f-b64628005833\" (UID: \"fe47625c-be5d-44ad-be5f-b64628005833\") " Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.166746 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw" (OuterVolumeSpecName: "kube-api-access-f5nfw") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "kube-api-access-f5nfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.207369 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.217641 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.218289 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.231297 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.246181 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config" (OuterVolumeSpecName: "config") pod "fe47625c-be5d-44ad-be5f-b64628005833" (UID: "fe47625c-be5d-44ad-be5f-b64628005833"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.249461 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.249503 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.249523 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.249535 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 
crc kubenswrapper[4826]: I0319 19:18:54.249837 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5nfw\" (UniqueName: \"kubernetes.io/projected/fe47625c-be5d-44ad-be5f-b64628005833-kube-api-access-f5nfw\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.249906 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe47625c-be5d-44ad-be5f-b64628005833-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.902516 4826 generic.go:334] "Generic (PLEG): container finished" podID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerID="5171a83f6c4abf2fc8b10d0be3c0168e50375a59d480df0886c0b1cb6d330034" exitCode=0 Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.902933 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerDied","Data":"5171a83f6c4abf2fc8b10d0be3c0168e50375a59d480df0886c0b1cb6d330034"} Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.905353 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="cinder-scheduler" containerID="cri-o://0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212" gracePeriod=30 Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.905791 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.919937 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-t2kwz" event={"ID":"fe47625c-be5d-44ad-be5f-b64628005833","Type":"ContainerDied","Data":"a25d2483196dcd97c82aec1949be20fa664f9128e15cf2b80e33f55b246fe072"} Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.919995 4826 scope.go:117] "RemoveContainer" containerID="2fad22d114982867e0d3f0a2233fb55f638b563f6899839e4b215945422a348e" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.920397 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="probe" containerID="cri-o://bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523" gracePeriod=30 Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.944829 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.990403 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.990465 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-t2kwz"] Mar 19 19:18:54 crc kubenswrapper[4826]: I0319 19:18:54.990987 4826 scope.go:117] "RemoveContainer" containerID="17abd6d3328ad0c68ad1637abc0c078d40b863486b2545ab438dc928843ace1d" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.282206 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.313419 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.377942 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378056 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378216 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378272 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378298 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378324 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.378417 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmv8\" (UniqueName: \"kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8\") pod \"ce18435f-6584-401e-b983-2d82bb66f9b3\" (UID: \"ce18435f-6584-401e-b983-2d82bb66f9b3\") " Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.387133 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8" (OuterVolumeSpecName: "kube-api-access-flmv8") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "kube-api-access-flmv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.388005 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.400202 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.400253 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.400298 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.401067 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.401109 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68" gracePeriod=600 Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.466064 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config" (OuterVolumeSpecName: "config") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.476750 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.482295 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.482326 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.482337 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmv8\" (UniqueName: \"kubernetes.io/projected/ce18435f-6584-401e-b983-2d82bb66f9b3-kube-api-access-flmv8\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.482346 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.499792 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.519149 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.536337 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ce18435f-6584-401e-b983-2d82bb66f9b3" (UID: "ce18435f-6584-401e-b983-2d82bb66f9b3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.587911 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.587941 4826 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.587951 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce18435f-6584-401e-b983-2d82bb66f9b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.933175 4826 generic.go:334] "Generic (PLEG): container finished" podID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerID="bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523" exitCode=0 Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.933453 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerDied","Data":"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523"} Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.935677 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf647cd5-pw2jr" event={"ID":"ce18435f-6584-401e-b983-2d82bb66f9b3","Type":"ContainerDied","Data":"65b26f8c3ab04755968bea8b177a153398a852fdb2a6b750a219a0567594dcea"} Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.935708 4826 scope.go:117] "RemoveContainer" containerID="039b77b34fa8a5bdae7d4e93f3a043efe118970621120649b5587efc23cd8535" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.935880 4826 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bf647cd5-pw2jr" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.947503 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68" exitCode=0 Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.947704 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68"} Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.947790 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"} Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.963118 4826 scope.go:117] "RemoveContainer" containerID="5171a83f6c4abf2fc8b10d0be3c0168e50375a59d480df0886c0b1cb6d330034" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.988968 4826 scope.go:117] "RemoveContainer" containerID="8f9b98750fb35334b26ac1561a7757e06810afb82592af11d7a0e1fbf0a43d22" Mar 19 19:18:55 crc kubenswrapper[4826]: I0319 19:18:55.998948 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe47625c-be5d-44ad-be5f-b64628005833" path="/var/lib/kubelet/pods/fe47625c-be5d-44ad-be5f-b64628005833/volumes" Mar 19 19:18:56 crc kubenswrapper[4826]: I0319 19:18:56.017356 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:56 crc kubenswrapper[4826]: I0319 19:18:56.022859 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8bf647cd5-pw2jr"] Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 
19:18:57.322033 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.608878 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733477 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom\") pod \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733522 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr49t\" (UniqueName: \"kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t\") pod \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733702 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts\") pod \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733759 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id\") pod \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733773 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle\") pod 
\"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.733816 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data\") pod \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\" (UID: \"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9\") " Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.734347 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.739396 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts" (OuterVolumeSpecName: "scripts") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.744888 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t" (OuterVolumeSpecName: "kube-api-access-rr49t") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "kube-api-access-rr49t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.764824 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.837092 4826 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.837124 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.837134 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr49t\" (UniqueName: \"kubernetes.io/projected/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-kube-api-access-rr49t\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.837146 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.885458 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.913745 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data" (OuterVolumeSpecName: "config-data") pod "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" (UID: "86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.939409 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.939437 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.973579 4826 generic.go:334] "Generic (PLEG): container finished" podID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerID="0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212" exitCode=0 Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.973622 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerDied","Data":"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212"} Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.973648 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9","Type":"ContainerDied","Data":"27a4ff33c3ca2893a28fb0f91f9778d783c00707eed34389c1797c384d27ede1"} Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.973676 4826 scope.go:117] "RemoveContainer" 
containerID="bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.973789 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:57 crc kubenswrapper[4826]: I0319 19:18:57.998903 4826 scope.go:117] "RemoveContainer" containerID="0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.001888 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" path="/var/lib/kubelet/pods/ce18435f-6584-401e-b983-2d82bb66f9b3/volumes" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.024697 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.041211 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.049081 4826 scope.go:117] "RemoveContainer" containerID="bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.052785 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523\": container with ID starting with bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523 not found: ID does not exist" containerID="bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.052955 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523"} err="failed to get container status \"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523\": rpc error: code = 
NotFound desc = could not find container \"bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523\": container with ID starting with bc8441ceb84c889a8f3734f3ce63df1fc8c2995c2d1543cce6bfa3d1eefcb523 not found: ID does not exist" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.053050 4826 scope.go:117] "RemoveContainer" containerID="0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.056788 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212\": container with ID starting with 0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212 not found: ID does not exist" containerID="0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.056934 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212"} err="failed to get container status \"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212\": rpc error: code = NotFound desc = could not find container \"0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212\": container with ID starting with 0e90f43cd5647d64c4e827c6a1d4530fccef269de3d3ebd30eee156a91d0d212 not found: ID does not exist" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.061931 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062441 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="cinder-scheduler" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062460 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" 
containerName="cinder-scheduler" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062474 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="dnsmasq-dns" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062481 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="dnsmasq-dns" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062491 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="probe" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062497 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="probe" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062509 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-api" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062515 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-api" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062527 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062535 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" Mar 19 19:18:58 crc kubenswrapper[4826]: E0319 19:18:58.062565 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="init" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062571 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="init" Mar 19 19:18:58 crc kubenswrapper[4826]: 
I0319 19:18:58.062914 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-api" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062934 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe47625c-be5d-44ad-be5f-b64628005833" containerName="dnsmasq-dns" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062945 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="cinder-scheduler" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062961 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce18435f-6584-401e-b983-2d82bb66f9b3" containerName="neutron-httpd" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.062977 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" containerName="probe" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.064252 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.067408 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.091151 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.142529 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0ba365-345a-4a2b-b919-b5e9de88b680-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.142578 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.142759 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c964p\" (UniqueName: \"kubernetes.io/projected/9c0ba365-345a-4a2b-b919-b5e9de88b680-kube-api-access-c964p\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.142967 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 
19:18:58.143023 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.143315 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245346 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245400 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0ba365-345a-4a2b-b919-b5e9de88b680-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245432 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245474 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c964p\" 
(UniqueName: \"kubernetes.io/projected/9c0ba365-345a-4a2b-b919-b5e9de88b680-kube-api-access-c964p\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245524 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0ba365-345a-4a2b-b919-b5e9de88b680-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245534 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.245644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.249875 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.250001 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " 
pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.251009 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.251777 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0ba365-345a-4a2b-b919-b5e9de88b680-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.261755 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c964p\" (UniqueName: \"kubernetes.io/projected/9c0ba365-345a-4a2b-b919-b5e9de88b680-kube-api-access-c964p\") pod \"cinder-scheduler-0\" (UID: \"9c0ba365-345a-4a2b-b919-b5e9de88b680\") " pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.389984 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.895866 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 19:18:58 crc kubenswrapper[4826]: I0319 19:18:58.994513 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0ba365-345a-4a2b-b919-b5e9de88b680","Type":"ContainerStarted","Data":"97a392e865da2833a9b5b39d65bd52bb1c2b34ce2cc25470e39b4dd1add00f63"} Mar 19 19:19:00 crc kubenswrapper[4826]: I0319 19:19:00.004761 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9" path="/var/lib/kubelet/pods/86f15ac2-bdb2-40e1-86e4-5b7d0bbcf8b9/volumes" Mar 19 19:19:00 crc kubenswrapper[4826]: I0319 19:19:00.017367 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0ba365-345a-4a2b-b919-b5e9de88b680","Type":"ContainerStarted","Data":"f7c7d7accd377b922c768cd2c63561dd625ead7eed451e5666328299f2ed6f49"} Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.035404 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0ba365-345a-4a2b-b919-b5e9de88b680","Type":"ContainerStarted","Data":"09185c0b3fe158b8f57e6443fc836760332c8b28ba9fde2b50b00ecef1b758bc"} Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.063777 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.0637560600000002 podStartE2EDuration="3.06375606s" podCreationTimestamp="2026-03-19 19:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:01.055107771 +0000 UTC m=+1365.809176104" watchObservedRunningTime="2026-03-19 19:19:01.06375606 +0000 UTC m=+1365.817824393" Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 
19:19:01.296138 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.370246 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-554d47978d-xzcgd" Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.471777 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.472007 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764c5457fd-mw862" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api-log" containerID="cri-o://1b0d81911a1aa40e52e5dcacfa559a1063dd6457625255436c1895d525f2ae7d" gracePeriod=30 Mar 19 19:19:01 crc kubenswrapper[4826]: I0319 19:19:01.472439 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764c5457fd-mw862" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api" containerID="cri-o://b743f3de347bd7aca25a9358c681da47e5ce86ff9e000ff79fef4ad280561d06" gracePeriod=30 Mar 19 19:19:02 crc kubenswrapper[4826]: I0319 19:19:02.048925 4826 generic.go:334] "Generic (PLEG): container finished" podID="3208df19-70b8-42be-b3b3-66b38532ab47" containerID="1b0d81911a1aa40e52e5dcacfa559a1063dd6457625255436c1895d525f2ae7d" exitCode=143 Mar 19 19:19:02 crc kubenswrapper[4826]: I0319 19:19:02.049011 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerDied","Data":"1b0d81911a1aa40e52e5dcacfa559a1063dd6457625255436c1895d525f2ae7d"} Mar 19 19:19:03 crc kubenswrapper[4826]: I0319 19:19:03.194867 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:19:03 crc 
kubenswrapper[4826]: I0319 19:19:03.195397 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:19:03 crc kubenswrapper[4826]: I0319 19:19:03.391041 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.261130 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-578cc9f57d-6xb4n" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.715777 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.717707 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.720405 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.720866 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-g2cwn" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.728751 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.739192 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.745594 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.745675 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqtgm\" (UniqueName: \"kubernetes.io/projected/a1f4e03c-3679-4122-a562-0de802c7f1c8-kube-api-access-pqtgm\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.745776 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.745810 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.848155 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.848337 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.848397 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pqtgm\" (UniqueName: \"kubernetes.io/projected/a1f4e03c-3679-4122-a562-0de802c7f1c8-kube-api-access-pqtgm\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.848508 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.849328 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.856367 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-openstack-config-secret\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.875168 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f4e03c-3679-4122-a562-0de802c7f1c8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:04 crc kubenswrapper[4826]: I0319 19:19:04.881322 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqtgm\" (UniqueName: \"kubernetes.io/projected/a1f4e03c-3679-4122-a562-0de802c7f1c8-kube-api-access-pqtgm\") pod \"openstackclient\" (UID: 
\"a1f4e03c-3679-4122-a562-0de802c7f1c8\") " pod="openstack/openstackclient" Mar 19 19:19:05 crc kubenswrapper[4826]: I0319 19:19:05.046938 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 19:19:05 crc kubenswrapper[4826]: I0319 19:19:05.219116 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764c5457fd-mw862" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:52206->10.217.0.207:9311: read: connection reset by peer" Mar 19 19:19:05 crc kubenswrapper[4826]: I0319 19:19:05.219173 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764c5457fd-mw862" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:52200->10.217.0.207:9311: read: connection reset by peer" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.127150 4826 generic.go:334] "Generic (PLEG): container finished" podID="3208df19-70b8-42be-b3b3-66b38532ab47" containerID="b743f3de347bd7aca25a9358c681da47e5ce86ff9e000ff79fef4ad280561d06" exitCode=0 Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.127720 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerDied","Data":"b743f3de347bd7aca25a9358c681da47e5ce86ff9e000ff79fef4ad280561d06"} Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.373866 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.479759 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.485482 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs\") pod \"3208df19-70b8-42be-b3b3-66b38532ab47\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.485741 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle\") pod \"3208df19-70b8-42be-b3b3-66b38532ab47\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.485800 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k589r\" (UniqueName: \"kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r\") pod \"3208df19-70b8-42be-b3b3-66b38532ab47\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.485862 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data\") pod \"3208df19-70b8-42be-b3b3-66b38532ab47\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.485924 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom\") pod \"3208df19-70b8-42be-b3b3-66b38532ab47\" (UID: \"3208df19-70b8-42be-b3b3-66b38532ab47\") " Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.486352 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs" (OuterVolumeSpecName: "logs") pod "3208df19-70b8-42be-b3b3-66b38532ab47" (UID: "3208df19-70b8-42be-b3b3-66b38532ab47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.493313 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r" (OuterVolumeSpecName: "kube-api-access-k589r") pod "3208df19-70b8-42be-b3b3-66b38532ab47" (UID: "3208df19-70b8-42be-b3b3-66b38532ab47"). InnerVolumeSpecName "kube-api-access-k589r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.507762 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3208df19-70b8-42be-b3b3-66b38532ab47" (UID: "3208df19-70b8-42be-b3b3-66b38532ab47"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.534577 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58fdd88484-zg97f" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.562809 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3208df19-70b8-42be-b3b3-66b38532ab47" (UID: "3208df19-70b8-42be-b3b3-66b38532ab47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.588502 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.588539 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k589r\" (UniqueName: \"kubernetes.io/projected/3208df19-70b8-42be-b3b3-66b38532ab47-kube-api-access-k589r\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.588550 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.588559 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3208df19-70b8-42be-b3b3-66b38532ab47-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.628544 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.628825 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-849c97b598-8kp5r" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-log" containerID="cri-o://4f149913d18416843831582f7a3640dcecba1ef3dac591f776f00032fd7813e5" gracePeriod=30 Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.629266 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-849c97b598-8kp5r" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-api" containerID="cri-o://55970305422c2e6cbf86f40c0b9e98cdee5dc4de5452a9f833a2e65e413a024d" 
gracePeriod=30 Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.629789 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data" (OuterVolumeSpecName: "config-data") pod "3208df19-70b8-42be-b3b3-66b38532ab47" (UID: "3208df19-70b8-42be-b3b3-66b38532ab47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.676489 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 19:19:06 crc kubenswrapper[4826]: I0319 19:19:06.690465 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3208df19-70b8-42be-b3b3-66b38532ab47-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.144184 4826 generic.go:334] "Generic (PLEG): container finished" podID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerID="4f149913d18416843831582f7a3640dcecba1ef3dac591f776f00032fd7813e5" exitCode=143 Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.144323 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerDied","Data":"4f149913d18416843831582f7a3640dcecba1ef3dac591f776f00032fd7813e5"} Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.146835 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764c5457fd-mw862" event={"ID":"3208df19-70b8-42be-b3b3-66b38532ab47","Type":"ContainerDied","Data":"aa6ab795d7abf4f6ba0e47a80a164e5a6c969486caa62c16ef4e7abb986c13e7"} Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.146906 4826 scope.go:117] "RemoveContainer" containerID="b743f3de347bd7aca25a9358c681da47e5ce86ff9e000ff79fef4ad280561d06" Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.147367 4826 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764c5457fd-mw862" Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.148140 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a1f4e03c-3679-4122-a562-0de802c7f1c8","Type":"ContainerStarted","Data":"1558d92766a98d52ad0a4f9760ddcb538bba990011dc68482b47280af3023a36"} Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.169825 4826 scope.go:117] "RemoveContainer" containerID="1b0d81911a1aa40e52e5dcacfa559a1063dd6457625255436c1895d525f2ae7d" Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.201222 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.211623 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-764c5457fd-mw862"] Mar 19 19:19:07 crc kubenswrapper[4826]: I0319 19:19:07.988440 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" path="/var/lib/kubelet/pods/3208df19-70b8-42be-b3b3-66b38532ab47/volumes" Mar 19 19:19:08 crc kubenswrapper[4826]: I0319 19:19:08.721042 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.591622 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5448ff86dc-ccghd"] Mar 19 19:19:09 crc kubenswrapper[4826]: E0319 19:19:09.592370 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.592390 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api" Mar 19 19:19:09 crc kubenswrapper[4826]: E0319 19:19:09.592406 4826 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api-log" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.592415 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api-log" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.592648 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.592695 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3208df19-70b8-42be-b3b3-66b38532ab47" containerName="barbican-api-log" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.593926 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.595857 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.595953 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.597900 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.617960 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5448ff86dc-ccghd"] Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.666418 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-public-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: 
I0319 19:19:09.666797 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-internal-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667067 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-run-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667183 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-etc-swift\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667269 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp7f2\" (UniqueName: \"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-kube-api-access-pp7f2\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667444 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-config-data\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 
19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667622 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-combined-ca-bundle\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.667803 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-log-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769429 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-public-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769510 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-internal-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769556 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-run-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc 
kubenswrapper[4826]: I0319 19:19:09.769582 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-etc-swift\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769609 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp7f2\" (UniqueName: \"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-kube-api-access-pp7f2\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-config-data\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769750 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-combined-ca-bundle\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.769805 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-log-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.770248 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-log-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.770479 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b2152b5-e974-469b-84e7-c0ca4d4a9826-run-httpd\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.775873 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-config-data\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.779944 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-public-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.781863 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-combined-ca-bundle\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.792186 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp7f2\" (UniqueName: 
\"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-kube-api-access-pp7f2\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.792229 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b2152b5-e974-469b-84e7-c0ca4d4a9826-internal-tls-certs\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.792619 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b2152b5-e974-469b-84e7-c0ca4d4a9826-etc-swift\") pod \"swift-proxy-5448ff86dc-ccghd\" (UID: \"2b2152b5-e974-469b-84e7-c0ca4d4a9826\") " pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:09 crc kubenswrapper[4826]: I0319 19:19:09.913059 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.188799 4826 generic.go:334] "Generic (PLEG): container finished" podID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerID="55970305422c2e6cbf86f40c0b9e98cdee5dc4de5452a9f833a2e65e413a024d" exitCode=0 Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.188899 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerDied","Data":"55970305422c2e6cbf86f40c0b9e98cdee5dc4de5452a9f833a2e65e413a024d"} Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.361902 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.387888 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.387952 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.388044 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.388137 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.388220 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.388240 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj97v\" (UniqueName: 
\"kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.388284 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle\") pod \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\" (UID: \"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e\") " Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.397506 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs" (OuterVolumeSpecName: "logs") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.402977 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v" (OuterVolumeSpecName: "kube-api-access-sj97v") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "kube-api-access-sj97v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.403105 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts" (OuterVolumeSpecName: "scripts") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.461756 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.471414 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data" (OuterVolumeSpecName: "config-data") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.490442 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.490471 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.490480 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.490491 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj97v\" (UniqueName: \"kubernetes.io/projected/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-kube-api-access-sj97v\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc 
kubenswrapper[4826]: I0319 19:19:10.490503 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.527716 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.548286 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" (UID: "35e51cc9-70e8-40ab-a613-b7f0ffe9b04e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.578088 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5448ff86dc-ccghd"] Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.591983 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.592012 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.657919 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.658179 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-central-agent" containerID="cri-o://a9f8657235e591f363052fd3c2fcbb52099699c71560e82f1e5a0d2b9a77da3a" gracePeriod=30 Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.658371 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="sg-core" containerID="cri-o://50e1e5031d7449f7952423ba72251965622473086a8461ec0897811162de9104" gracePeriod=30 Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.658386 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="proxy-httpd" containerID="cri-o://8b0eb43fb18b04265be0434a2dca34a28695c4a011ebc5d93034c54b31099c01" gracePeriod=30 Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 
19:19:10.658519 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-notification-agent" containerID="cri-o://c430bf06f9cb0c55a5511c5e1e4019e78128c93bb6673178225f9c22fa273849" gracePeriod=30 Mar 19 19:19:10 crc kubenswrapper[4826]: I0319 19:19:10.759517 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.211:3000/\": read tcp 10.217.0.2:51088->10.217.0.211:3000: read: connection reset by peer" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.211115 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448ff86dc-ccghd" event={"ID":"2b2152b5-e974-469b-84e7-c0ca4d4a9826","Type":"ContainerStarted","Data":"f1b43981f82a4e052f98dbd3482cff45268567ab9460d55d135c92c3815798ae"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.212514 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448ff86dc-ccghd" event={"ID":"2b2152b5-e974-469b-84e7-c0ca4d4a9826","Type":"ContainerStarted","Data":"0d7e5ff5ff3b2b2b5eceb21340d1f08660bbefdf8c838589f778c0be319f9420"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.212531 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5448ff86dc-ccghd" event={"ID":"2b2152b5-e974-469b-84e7-c0ca4d4a9826","Type":"ContainerStarted","Data":"848f7f98f71f3a64e1898608ed399e01bae14bac899fd82a32f77ce7e7d8d49e"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.213758 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.213810 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:11 crc 
kubenswrapper[4826]: I0319 19:19:11.220537 4826 generic.go:334] "Generic (PLEG): container finished" podID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerID="8b0eb43fb18b04265be0434a2dca34a28695c4a011ebc5d93034c54b31099c01" exitCode=0 Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220568 4826 generic.go:334] "Generic (PLEG): container finished" podID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerID="50e1e5031d7449f7952423ba72251965622473086a8461ec0897811162de9104" exitCode=2 Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220575 4826 generic.go:334] "Generic (PLEG): container finished" podID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerID="c430bf06f9cb0c55a5511c5e1e4019e78128c93bb6673178225f9c22fa273849" exitCode=0 Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220581 4826 generic.go:334] "Generic (PLEG): container finished" podID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerID="a9f8657235e591f363052fd3c2fcbb52099699c71560e82f1e5a0d2b9a77da3a" exitCode=0 Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220619 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerDied","Data":"8b0eb43fb18b04265be0434a2dca34a28695c4a011ebc5d93034c54b31099c01"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220646 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerDied","Data":"50e1e5031d7449f7952423ba72251965622473086a8461ec0897811162de9104"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220672 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerDied","Data":"c430bf06f9cb0c55a5511c5e1e4019e78128c93bb6673178225f9c22fa273849"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.220683 4826 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerDied","Data":"a9f8657235e591f363052fd3c2fcbb52099699c71560e82f1e5a0d2b9a77da3a"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.223060 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-849c97b598-8kp5r" event={"ID":"35e51cc9-70e8-40ab-a613-b7f0ffe9b04e","Type":"ContainerDied","Data":"f89ad55452439cf7b6024e24d15ded80d2d695458438f9abf5bf2d12b1fe1824"} Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.223093 4826 scope.go:117] "RemoveContainer" containerID="55970305422c2e6cbf86f40c0b9e98cdee5dc4de5452a9f833a2e65e413a024d" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.223240 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-849c97b598-8kp5r" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.243076 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5448ff86dc-ccghd" podStartSLOduration=2.243056577 podStartE2EDuration="2.243056577s" podCreationTimestamp="2026-03-19 19:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:11.234305325 +0000 UTC m=+1375.988373648" watchObservedRunningTime="2026-03-19 19:19:11.243056577 +0000 UTC m=+1375.997124890" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.267411 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.269836 4826 scope.go:117] "RemoveContainer" containerID="4f149913d18416843831582f7a3640dcecba1ef3dac591f776f00032fd7813e5" Mar 19 19:19:11 crc kubenswrapper[4826]: I0319 19:19:11.281341 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-849c97b598-8kp5r"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 
19:19:11.535213 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.619952 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.619994 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.620033 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5kjt\" (UniqueName: \"kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.620105 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.620144 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.620183 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.620201 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts\") pod \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\" (UID: \"afbec0cf-36b2-4154-b0e2-6d886a4a0344\") " Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.623906 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.625259 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.663982 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts" (OuterVolumeSpecName: "scripts") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.671819 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt" (OuterVolumeSpecName: "kube-api-access-g5kjt") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "kube-api-access-g5kjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.723327 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.723347 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.723356 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/afbec0cf-36b2-4154-b0e2-6d886a4a0344-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.723364 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5kjt\" (UniqueName: \"kubernetes.io/projected/afbec0cf-36b2-4154-b0e2-6d886a4a0344-kube-api-access-g5kjt\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.738802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.814572 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.832702 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.832740 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.890743 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data" (OuterVolumeSpecName: "config-data") pod "afbec0cf-36b2-4154-b0e2-6d886a4a0344" (UID: "afbec0cf-36b2-4154-b0e2-6d886a4a0344"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.934876 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afbec0cf-36b2-4154-b0e2-6d886a4a0344-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:11.991526 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" path="/var/lib/kubelet/pods/35e51cc9-70e8-40ab-a613-b7f0ffe9b04e/volumes" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.238049 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"afbec0cf-36b2-4154-b0e2-6d886a4a0344","Type":"ContainerDied","Data":"c2824a06dd03aab824cfbf1aad6bd19fc90c47d988df37ce1585f947b12b7bcb"} Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.238096 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.238336 4826 scope.go:117] "RemoveContainer" containerID="8b0eb43fb18b04265be0434a2dca34a28695c4a011ebc5d93034c54b31099c01" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.274002 4826 scope.go:117] "RemoveContainer" containerID="50e1e5031d7449f7952423ba72251965622473086a8461ec0897811162de9104" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.279568 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.298299 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311253 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311729 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-log" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311747 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-log" Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311764 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-notification-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311771 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-notification-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311805 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-api" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311811 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-api" Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311821 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="sg-core" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311826 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="sg-core" Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311859 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-central-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311866 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-central-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: E0319 19:19:12.311884 4826 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="proxy-httpd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.311890 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="proxy-httpd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312132 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="sg-core" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312150 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-central-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312165 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-api" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312183 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="ceilometer-notification-agent" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312191 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e51cc9-70e8-40ab-a613-b7f0ffe9b04e" containerName="placement-log" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.312201 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" containerName="proxy-httpd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.314215 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.317078 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.317303 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.318681 4826 scope.go:117] "RemoveContainer" containerID="c430bf06f9cb0c55a5511c5e1e4019e78128c93bb6673178225f9c22fa273849" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.337285 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342032 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342106 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6nk\" (UniqueName: \"kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342129 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342214 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342316 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342343 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.342391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.398811 4826 scope.go:117] "RemoveContainer" containerID="a9f8657235e591f363052fd3c2fcbb52099699c71560e82f1e5a0d2b9a77da3a" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.443946 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6nk\" (UniqueName: \"kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.443987 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.444056 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.444140 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.444175 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.444203 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.444283 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " 
pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.450387 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.450646 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.450709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.451803 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.452019 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.458136 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.474288 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv6nk\" (UniqueName: \"kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk\") pod \"ceilometer-0\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.485067 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-n9d4c"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.486540 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.507713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n9d4c"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.546455 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.546528 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cc8d\" (UniqueName: \"kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.575124 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-96hcs"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.576427 
4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.603809 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-96hcs"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.613251 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c0a7-account-create-update-7mljh"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.615600 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.620668 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.643592 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c0a7-account-create-update-7mljh"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648164 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8pn\" (UniqueName: \"kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648304 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648334 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cc8d\" 
(UniqueName: \"kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648380 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2k4\" (UniqueName: \"kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4\") pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648418 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts\") pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.648440 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.649096 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.666757 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4cc8d\" (UniqueName: \"kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d\") pod \"nova-api-db-create-n9d4c\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.692723 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.695843 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-s9rqd"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.697357 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.712386 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9rqd"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts\") pod \"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750487 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2k4\" (UniqueName: \"kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4\") pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750513 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts\") 
pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750537 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750587 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8pn\" (UniqueName: \"kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.750681 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5lt\" (UniqueName: \"kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt\") pod \"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.751585 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts\") pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.752041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.770539 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8pn\" (UniqueName: \"kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn\") pod \"nova-api-c0a7-account-create-update-7mljh\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.772287 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2k4\" (UniqueName: \"kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4\") pod \"nova-cell0-db-create-96hcs\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.782543 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9a8c-account-create-update-s22ds"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.783882 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.788876 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.797405 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9a8c-account-create-update-s22ds"] Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.852465 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2wk\" (UniqueName: \"kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.852524 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.852565 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5lt\" (UniqueName: \"kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt\") pod \"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.852635 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts\") pod 
\"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.853292 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts\") pod \"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.874170 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5lt\" (UniqueName: \"kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt\") pod \"nova-cell1-db-create-s9rqd\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.897538 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.904617 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.942331 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.955428 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2wk\" (UniqueName: \"kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.955693 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.956387 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:12 crc kubenswrapper[4826]: I0319 19:19:12.983382 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2wk\" (UniqueName: \"kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk\") pod \"nova-cell0-9a8c-account-create-update-s22ds\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.003756 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ac0a-account-create-update-nrsjr"] Mar 19 19:19:13 crc kubenswrapper[4826]: 
I0319 19:19:13.006833 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.009185 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.012949 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ac0a-account-create-update-nrsjr"] Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.013648 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.088145 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.088627 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkxq\" (UniqueName: \"kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.140364 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.191954 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkxq\" (UniqueName: \"kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.192085 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.192850 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.211911 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkxq\" (UniqueName: \"kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq\") pod \"nova-cell1-ac0a-account-create-update-nrsjr\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.364223 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:13 crc kubenswrapper[4826]: I0319 19:19:13.993482 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afbec0cf-36b2-4154-b0e2-6d886a4a0344" path="/var/lib/kubelet/pods/afbec0cf-36b2-4154-b0e2-6d886a4a0344/volumes" Mar 19 19:19:18 crc kubenswrapper[4826]: I0319 19:19:18.908995 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6674b95fd5-8s42n" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.010069 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.010792 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dddfc89b6-hxwk2" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-api" containerID="cri-o://83e7c3bfc46a79113e6c663c7ed9ecee31bd393e79e33163c32138d7bb8d0b91" gracePeriod=30 Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.010917 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dddfc89b6-hxwk2" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-httpd" containerID="cri-o://cd104fc87b47839a2c9c5fcb924ef27eb67868772e968598c2ea2c5177099e9b" gracePeriod=30 Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.384599 4826 generic.go:334] "Generic (PLEG): container finished" podID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerID="72f2353bdff336712857ae56e521d63cc77595f5b5da40c4f3faf20eeaa4d307" exitCode=137 Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.384950 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerDied","Data":"72f2353bdff336712857ae56e521d63cc77595f5b5da40c4f3faf20eeaa4d307"} Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.430076 
4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.555420 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.555871 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.555911 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.555962 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.556109 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.556162 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zzlmg\" (UniqueName: \"kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.556642 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs" (OuterVolumeSpecName: "logs") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.556631 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.556887 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data\") pod \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\" (UID: \"e115aacd-2c47-4070-bcdf-ead46ded7eb5\") " Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.558019 4826 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e115aacd-2c47-4070-bcdf-ead46ded7eb5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.558034 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e115aacd-2c47-4070-bcdf-ead46ded7eb5-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.565423 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts" (OuterVolumeSpecName: "scripts") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.565546 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg" (OuterVolumeSpecName: "kube-api-access-zzlmg") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "kube-api-access-zzlmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.566338 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.609898 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.652008 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data" (OuterVolumeSpecName: "config-data") pod "e115aacd-2c47-4070-bcdf-ead46ded7eb5" (UID: "e115aacd-2c47-4070-bcdf-ead46ded7eb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.676428 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzlmg\" (UniqueName: \"kubernetes.io/projected/e115aacd-2c47-4070-bcdf-ead46ded7eb5-kube-api-access-zzlmg\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.676462 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.676472 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.676481 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.676492 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115aacd-2c47-4070-bcdf-ead46ded7eb5-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.887755 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s9rqd"] Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.942968 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:19 crc kubenswrapper[4826]: I0319 19:19:19.999866 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5448ff86dc-ccghd" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.181723 4826 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c0a7-account-create-update-7mljh"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.195623 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.195699 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n9d4c"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.218580 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-96hcs"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.234714 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.241197 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.244434 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9a8c-account-create-update-s22ds"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.253932 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ac0a-account-create-update-nrsjr"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.329369 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.420049 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a1f4e03c-3679-4122-a562-0de802c7f1c8","Type":"ContainerStarted","Data":"a6d852ec1fe151045a309fea65a56dcd4b27e73fff601279e688ac287cee2e15"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.423975 4826 generic.go:334] "Generic (PLEG): container finished" podID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerID="cd104fc87b47839a2c9c5fcb924ef27eb67868772e968598c2ea2c5177099e9b" exitCode=0 Mar 19 
19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.424020 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerDied","Data":"cd104fc87b47839a2c9c5fcb924ef27eb67868772e968598c2ea2c5177099e9b"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.430473 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" event={"ID":"603d736d-cb9b-41e3-b971-82c457932511","Type":"ContainerStarted","Data":"f4de9dd67d03b692384cbded6daf6812b7c94bf8faf0e34c8952f6885c3b6387"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.433714 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96hcs" event={"ID":"56d23be9-1486-41d9-9f8f-86ca8965b96c","Type":"ContainerStarted","Data":"06132281a52afeae3e83d340219e5759dea1b23bbdee35ce8b45bd171ab343c5"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.443999 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.940014124 podStartE2EDuration="16.443983447s" podCreationTimestamp="2026-03-19 19:19:04 +0000 UTC" firstStartedPulling="2026-03-19 19:19:06.66218202 +0000 UTC m=+1371.416250333" lastFinishedPulling="2026-03-19 19:19:19.166151343 +0000 UTC m=+1383.920219656" observedRunningTime="2026-03-19 19:19:20.437562012 +0000 UTC m=+1385.191630325" watchObservedRunningTime="2026-03-19 19:19:20.443983447 +0000 UTC m=+1385.198051760" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.449217 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" event={"ID":"38780138-b2f9-49a0-9ed1-90ee6fbb4c11","Type":"ContainerStarted","Data":"95a659df8e6c70a8eaf1809b6039b6f09a906d974632ca683c38f08a7f129e7e"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.458305 4826 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-c0a7-account-create-update-7mljh" event={"ID":"9361ba16-5f40-4c7b-a698-dd74fb1d4af7","Type":"ContainerStarted","Data":"2eab71a633e0fb447584bc572850911366f7b2383a31e935c94c3ded1c66cab6"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.462210 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9d4c" event={"ID":"a60e04c4-caed-4173-ac2a-4dca3c4ae080","Type":"ContainerStarted","Data":"26e7d37dbe6bbb92df07a22477c1875d45ad67ceada3adbe76cdf77b37fe47ad"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.469295 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9rqd" event={"ID":"81a7caa7-3837-46ff-9008-bc8784373127","Type":"ContainerStarted","Data":"77ec9d965c883abe338c2908131eb66c9fdb6227af45362583d8fc2afc23015c"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.469334 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9rqd" event={"ID":"81a7caa7-3837-46ff-9008-bc8784373127","Type":"ContainerStarted","Data":"331f247c7ee6396d1fa82ce7028a3badca5a8cca76b46758f0b62dd9dce9a181"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.478122 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e115aacd-2c47-4070-bcdf-ead46ded7eb5","Type":"ContainerDied","Data":"6b28b1457b009b290034ce1999b13b277c53b38160bb137b2d549422a3368c92"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.478175 4826 scope.go:117] "RemoveContainer" containerID="72f2353bdff336712857ae56e521d63cc77595f5b5da40c4f3faf20eeaa4d307" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.478320 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.503937 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-s9rqd" podStartSLOduration=8.503922185 podStartE2EDuration="8.503922185s" podCreationTimestamp="2026-03-19 19:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:20.499808186 +0000 UTC m=+1385.253876499" watchObservedRunningTime="2026-03-19 19:19:20.503922185 +0000 UTC m=+1385.257990498" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.508852 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerStarted","Data":"b15a64e3bc4b774c8ec8909d1d3f3c4a6f794ebc83684f4b29ccbfc181463168"} Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.547692 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.560481 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.573758 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:19:20 crc kubenswrapper[4826]: E0319 19:19:20.574377 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api-log" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.574392 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api-log" Mar 19 19:19:20 crc kubenswrapper[4826]: E0319 19:19:20.574414 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api" Mar 19 19:19:20 crc kubenswrapper[4826]: 
I0319 19:19:20.574420 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.574677 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api-log" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.574701 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" containerName="cinder-api" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.575892 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.582356 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.583386 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.583419 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.583457 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.620380 4826 scope.go:117] "RemoveContainer" containerID="e19d9e6d29439f3ce97323998035daa104edaa30e430992094a585e1dc50a1ce" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716085 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-scripts\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716566 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716735 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716847 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716912 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.716976 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data-custom\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.717093 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e18e7f7e-f1f1-4349-a076-79e1f781315d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.717216 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj9g4\" (UniqueName: \"kubernetes.io/projected/e18e7f7e-f1f1-4349-a076-79e1f781315d-kube-api-access-hj9g4\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.717323 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e18e7f7e-f1f1-4349-a076-79e1f781315d-logs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819360 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-scripts\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819440 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819495 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc 
kubenswrapper[4826]: I0319 19:19:20.819519 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819538 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data-custom\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819551 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819587 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e18e7f7e-f1f1-4349-a076-79e1f781315d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819630 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj9g4\" (UniqueName: \"kubernetes.io/projected/e18e7f7e-f1f1-4349-a076-79e1f781315d-kube-api-access-hj9g4\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.819708 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e18e7f7e-f1f1-4349-a076-79e1f781315d-logs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.820340 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e18e7f7e-f1f1-4349-a076-79e1f781315d-logs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.820826 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e18e7f7e-f1f1-4349-a076-79e1f781315d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.825178 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data-custom\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.825201 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.825347 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-scripts\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.825896 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.826154 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.826931 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e18e7f7e-f1f1-4349-a076-79e1f781315d-config-data\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.840453 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj9g4\" (UniqueName: \"kubernetes.io/projected/e18e7f7e-f1f1-4349-a076-79e1f781315d-kube-api-access-hj9g4\") pod \"cinder-api-0\" (UID: \"e18e7f7e-f1f1-4349-a076-79e1f781315d\") " pod="openstack/cinder-api-0" Mar 19 19:19:20 crc kubenswrapper[4826]: I0319 19:19:20.915466 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.469825 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.525686 4826 generic.go:334] "Generic (PLEG): container finished" podID="38780138-b2f9-49a0-9ed1-90ee6fbb4c11" containerID="ef10ff692a80616e83e4a7fc61781a325ff0c035046a7ef4b60a86295cee14d3" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.525746 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" event={"ID":"38780138-b2f9-49a0-9ed1-90ee6fbb4c11","Type":"ContainerDied","Data":"ef10ff692a80616e83e4a7fc61781a325ff0c035046a7ef4b60a86295cee14d3"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.531324 4826 generic.go:334] "Generic (PLEG): container finished" podID="a60e04c4-caed-4173-ac2a-4dca3c4ae080" containerID="5be86ed4ca6a2ad91533eb1be40cbdd99b329ac6d4f507653c629adccbddfd57" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.531403 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9d4c" event={"ID":"a60e04c4-caed-4173-ac2a-4dca3c4ae080","Type":"ContainerDied","Data":"5be86ed4ca6a2ad91533eb1be40cbdd99b329ac6d4f507653c629adccbddfd57"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.534767 4826 generic.go:334] "Generic (PLEG): container finished" podID="81a7caa7-3837-46ff-9008-bc8784373127" containerID="77ec9d965c883abe338c2908131eb66c9fdb6227af45362583d8fc2afc23015c" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.534931 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9rqd" event={"ID":"81a7caa7-3837-46ff-9008-bc8784373127","Type":"ContainerDied","Data":"77ec9d965c883abe338c2908131eb66c9fdb6227af45362583d8fc2afc23015c"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.561563 4826 generic.go:334] 
"Generic (PLEG): container finished" podID="56d23be9-1486-41d9-9f8f-86ca8965b96c" containerID="21817210888fa22816cfc57f69fbf7ffdd658827d7d41be0bd9f61500d3d372d" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.561637 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96hcs" event={"ID":"56d23be9-1486-41d9-9f8f-86ca8965b96c","Type":"ContainerDied","Data":"21817210888fa22816cfc57f69fbf7ffdd658827d7d41be0bd9f61500d3d372d"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.572131 4826 generic.go:334] "Generic (PLEG): container finished" podID="603d736d-cb9b-41e3-b971-82c457932511" containerID="fff89c799bb4a7d8f1c1471e29d1fcea876eef3b3721a68532df2c08ee66dbf4" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.572215 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" event={"ID":"603d736d-cb9b-41e3-b971-82c457932511","Type":"ContainerDied","Data":"fff89c799bb4a7d8f1c1471e29d1fcea876eef3b3721a68532df2c08ee66dbf4"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.580359 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e18e7f7e-f1f1-4349-a076-79e1f781315d","Type":"ContainerStarted","Data":"5423f00f33154934cdfe3984b7f513db9f6045d794434ef95139e8a197aa92b3"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.584684 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerStarted","Data":"8accb492b569909be45c55e670bdb92bd9bbbf345c576a4ea28dc2a36fd9db7f"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.594034 4826 generic.go:334] "Generic (PLEG): container finished" podID="9361ba16-5f40-4c7b-a698-dd74fb1d4af7" containerID="fdf467b26b8ede7bc3176bd1dc6d2b9e2853361e447eed4af4ca4256a3452752" exitCode=0 Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.594141 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c0a7-account-create-update-7mljh" event={"ID":"9361ba16-5f40-4c7b-a698-dd74fb1d4af7","Type":"ContainerDied","Data":"fdf467b26b8ede7bc3176bd1dc6d2b9e2853361e447eed4af4ca4256a3452752"} Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.677313 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:21 crc kubenswrapper[4826]: I0319 19:19:21.997260 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e115aacd-2c47-4070-bcdf-ead46ded7eb5" path="/var/lib/kubelet/pods/e115aacd-2c47-4070-bcdf-ead46ded7eb5/volumes" Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.620374 4826 generic.go:334] "Generic (PLEG): container finished" podID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerID="83e7c3bfc46a79113e6c663c7ed9ecee31bd393e79e33163c32138d7bb8d0b91" exitCode=0 Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.620902 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerDied","Data":"83e7c3bfc46a79113e6c663c7ed9ecee31bd393e79e33163c32138d7bb8d0b91"} Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.642681 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e18e7f7e-f1f1-4349-a076-79e1f781315d","Type":"ContainerStarted","Data":"7b62520fcf46b3ee9238ce4c43cc7b274c9e9d4079a1b7eccea61461e0649fc8"} Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.662728 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerStarted","Data":"908540e86d64eae77c502214c1c8eb51757c13a3192a1f8fb098abf8fb0dcabd"} Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.662776 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerStarted","Data":"789ca983399ae59295135fa5841d92d23fc9e5743f6fec0e299fe4c25b8db75f"} Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.735784 4826 scope.go:117] "RemoveContainer" containerID="665e7e9dcfc20ff31b70f3ab3bfdd29cb1836b7d0c40dc480e7cb291800adfd0" Mar 19 19:19:22 crc kubenswrapper[4826]: I0319 19:19:22.963203 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.094753 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs\") pod \"5ef1c972-c655-460d-a249-a1f7d4990b2f\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.095007 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xsbl\" (UniqueName: \"kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl\") pod \"5ef1c972-c655-460d-a249-a1f7d4990b2f\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.095032 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config\") pod \"5ef1c972-c655-460d-a249-a1f7d4990b2f\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.095098 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config\") pod \"5ef1c972-c655-460d-a249-a1f7d4990b2f\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.095266 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle\") pod \"5ef1c972-c655-460d-a249-a1f7d4990b2f\" (UID: \"5ef1c972-c655-460d-a249-a1f7d4990b2f\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.110395 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5ef1c972-c655-460d-a249-a1f7d4990b2f" (UID: "5ef1c972-c655-460d-a249-a1f7d4990b2f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.168077 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl" (OuterVolumeSpecName: "kube-api-access-7xsbl") pod "5ef1c972-c655-460d-a249-a1f7d4990b2f" (UID: "5ef1c972-c655-460d-a249-a1f7d4990b2f"). InnerVolumeSpecName "kube-api-access-7xsbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.204160 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xsbl\" (UniqueName: \"kubernetes.io/projected/5ef1c972-c655-460d-a249-a1f7d4990b2f-kube-api-access-7xsbl\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.204195 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.298793 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ef1c972-c655-460d-a249-a1f7d4990b2f" (UID: "5ef1c972-c655-460d-a249-a1f7d4990b2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.307371 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.365903 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config" (OuterVolumeSpecName: "config") pod "5ef1c972-c655-460d-a249-a1f7d4990b2f" (UID: "5ef1c972-c655-460d-a249-a1f7d4990b2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.396054 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5ef1c972-c655-460d-a249-a1f7d4990b2f" (UID: "5ef1c972-c655-460d-a249-a1f7d4990b2f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.409401 4826 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.409441 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ef1c972-c655-460d-a249-a1f7d4990b2f-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.485043 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.512161 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.543234 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.559718 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.577973 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.584432 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.634945 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts\") pod \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.635702 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38780138-b2f9-49a0-9ed1-90ee6fbb4c11" (UID: "38780138-b2f9-49a0-9ed1-90ee6fbb4c11"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.635875 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cc8d\" (UniqueName: \"kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d\") pod \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.635982 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts\") pod \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\" (UID: \"a60e04c4-caed-4173-ac2a-4dca3c4ae080\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636011 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8pn\" (UniqueName: \"kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn\") pod \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636042 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts\") pod \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\" (UID: \"9361ba16-5f40-4c7b-a698-dd74fb1d4af7\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636059 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qkxq\" (UniqueName: \"kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq\") pod \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\" (UID: \"38780138-b2f9-49a0-9ed1-90ee6fbb4c11\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636094 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xt2wk\" (UniqueName: \"kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk\") pod \"603d736d-cb9b-41e3-b971-82c457932511\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636131 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts\") pod \"603d736d-cb9b-41e3-b971-82c457932511\" (UID: \"603d736d-cb9b-41e3-b971-82c457932511\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636206 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr5lt\" (UniqueName: \"kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt\") pod \"81a7caa7-3837-46ff-9008-bc8784373127\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.636270 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts\") pod \"81a7caa7-3837-46ff-9008-bc8784373127\" (UID: \"81a7caa7-3837-46ff-9008-bc8784373127\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.637092 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.637384 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81a7caa7-3837-46ff-9008-bc8784373127" (UID: "81a7caa7-3837-46ff-9008-bc8784373127"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.638049 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "603d736d-cb9b-41e3-b971-82c457932511" (UID: "603d736d-cb9b-41e3-b971-82c457932511"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.642202 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a60e04c4-caed-4173-ac2a-4dca3c4ae080" (UID: "a60e04c4-caed-4173-ac2a-4dca3c4ae080"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.642218 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9361ba16-5f40-4c7b-a698-dd74fb1d4af7" (UID: "9361ba16-5f40-4c7b-a698-dd74fb1d4af7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.648832 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn" (OuterVolumeSpecName: "kube-api-access-9v8pn") pod "9361ba16-5f40-4c7b-a698-dd74fb1d4af7" (UID: "9361ba16-5f40-4c7b-a698-dd74fb1d4af7"). InnerVolumeSpecName "kube-api-access-9v8pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.648885 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt" (OuterVolumeSpecName: "kube-api-access-rr5lt") pod "81a7caa7-3837-46ff-9008-bc8784373127" (UID: "81a7caa7-3837-46ff-9008-bc8784373127"). InnerVolumeSpecName "kube-api-access-rr5lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.648979 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d" (OuterVolumeSpecName: "kube-api-access-4cc8d") pod "a60e04c4-caed-4173-ac2a-4dca3c4ae080" (UID: "a60e04c4-caed-4173-ac2a-4dca3c4ae080"). InnerVolumeSpecName "kube-api-access-4cc8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.648878 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk" (OuterVolumeSpecName: "kube-api-access-xt2wk") pod "603d736d-cb9b-41e3-b971-82c457932511" (UID: "603d736d-cb9b-41e3-b971-82c457932511"). InnerVolumeSpecName "kube-api-access-xt2wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.651142 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq" (OuterVolumeSpecName: "kube-api-access-2qkxq") pod "38780138-b2f9-49a0-9ed1-90ee6fbb4c11" (UID: "38780138-b2f9-49a0-9ed1-90ee6fbb4c11"). InnerVolumeSpecName "kube-api-access-2qkxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.700599 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e18e7f7e-f1f1-4349-a076-79e1f781315d","Type":"ContainerStarted","Data":"425309022b33183f35ee911ca8fa1173162a208d6994b2bf131a2004480c6623"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.702378 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.708187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96hcs" event={"ID":"56d23be9-1486-41d9-9f8f-86ca8965b96c","Type":"ContainerDied","Data":"06132281a52afeae3e83d340219e5759dea1b23bbdee35ce8b45bd171ab343c5"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.708234 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06132281a52afeae3e83d340219e5759dea1b23bbdee35ce8b45bd171ab343c5" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.708334 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-96hcs" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.724952 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.725410 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ac0a-account-create-update-nrsjr" event={"ID":"38780138-b2f9-49a0-9ed1-90ee6fbb4c11","Type":"ContainerDied","Data":"95a659df8e6c70a8eaf1809b6039b6f09a906d974632ca683c38f08a7f129e7e"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.725898 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a659df8e6c70a8eaf1809b6039b6f09a906d974632ca683c38f08a7f129e7e" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.730852 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.7308344780000002 podStartE2EDuration="3.730834478s" podCreationTimestamp="2026-03-19 19:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:23.720824296 +0000 UTC m=+1388.474892619" watchObservedRunningTime="2026-03-19 19:19:23.730834478 +0000 UTC m=+1388.484902791" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.734062 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c0a7-account-create-update-7mljh" event={"ID":"9361ba16-5f40-4c7b-a698-dd74fb1d4af7","Type":"ContainerDied","Data":"2eab71a633e0fb447584bc572850911366f7b2383a31e935c94c3ded1c66cab6"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.734096 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eab71a633e0fb447584bc572850911366f7b2383a31e935c94c3ded1c66cab6" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.734173 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c0a7-account-create-update-7mljh" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.739970 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts\") pod \"56d23be9-1486-41d9-9f8f-86ca8965b96c\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.740516 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w2k4\" (UniqueName: \"kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4\") pod \"56d23be9-1486-41d9-9f8f-86ca8965b96c\" (UID: \"56d23be9-1486-41d9-9f8f-86ca8965b96c\") " Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.741507 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cc8d\" (UniqueName: \"kubernetes.io/projected/a60e04c4-caed-4173-ac2a-4dca3c4ae080-kube-api-access-4cc8d\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.741813 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a60e04c4-caed-4173-ac2a-4dca3c4ae080-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742052 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8pn\" (UniqueName: \"kubernetes.io/projected/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-kube-api-access-9v8pn\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742134 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9361ba16-5f40-4c7b-a698-dd74fb1d4af7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742190 4826 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qkxq\" (UniqueName: \"kubernetes.io/projected/38780138-b2f9-49a0-9ed1-90ee6fbb4c11-kube-api-access-2qkxq\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742240 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt2wk\" (UniqueName: \"kubernetes.io/projected/603d736d-cb9b-41e3-b971-82c457932511-kube-api-access-xt2wk\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742289 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/603d736d-cb9b-41e3-b971-82c457932511-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742641 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr5lt\" (UniqueName: \"kubernetes.io/projected/81a7caa7-3837-46ff-9008-bc8784373127-kube-api-access-rr5lt\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742820 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81a7caa7-3837-46ff-9008-bc8784373127-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742607 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n9d4c" event={"ID":"a60e04c4-caed-4173-ac2a-4dca3c4ae080","Type":"ContainerDied","Data":"26e7d37dbe6bbb92df07a22477c1875d45ad67ceada3adbe76cdf77b37fe47ad"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742926 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e7d37dbe6bbb92df07a22477c1875d45ad67ceada3adbe76cdf77b37fe47ad" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.740800 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56d23be9-1486-41d9-9f8f-86ca8965b96c" (UID: "56d23be9-1486-41d9-9f8f-86ca8965b96c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.742592 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n9d4c" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.747227 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4" (OuterVolumeSpecName: "kube-api-access-2w2k4") pod "56d23be9-1486-41d9-9f8f-86ca8965b96c" (UID: "56d23be9-1486-41d9-9f8f-86ca8965b96c"). InnerVolumeSpecName "kube-api-access-2w2k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.754416 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dddfc89b6-hxwk2" event={"ID":"5ef1c972-c655-460d-a249-a1f7d4990b2f","Type":"ContainerDied","Data":"6231258059cb48cf598379d2a2b3f2474af44be853875b7a576d0be465f40737"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.754459 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dddfc89b6-hxwk2" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.754467 4826 scope.go:117] "RemoveContainer" containerID="cd104fc87b47839a2c9c5fcb924ef27eb67868772e968598c2ea2c5177099e9b" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.764722 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.764725 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9a8c-account-create-update-s22ds" event={"ID":"603d736d-cb9b-41e3-b971-82c457932511","Type":"ContainerDied","Data":"f4de9dd67d03b692384cbded6daf6812b7c94bf8faf0e34c8952f6885c3b6387"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.764831 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4de9dd67d03b692384cbded6daf6812b7c94bf8faf0e34c8952f6885c3b6387" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.767140 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s9rqd" event={"ID":"81a7caa7-3837-46ff-9008-bc8784373127","Type":"ContainerDied","Data":"331f247c7ee6396d1fa82ce7028a3badca5a8cca76b46758f0b62dd9dce9a181"} Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.767189 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331f247c7ee6396d1fa82ce7028a3badca5a8cca76b46758f0b62dd9dce9a181" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.767277 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s9rqd" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.800429 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802033 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60e04c4-caed-4173-ac2a-4dca3c4ae080" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802055 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60e04c4-caed-4173-ac2a-4dca3c4ae080" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802078 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9361ba16-5f40-4c7b-a698-dd74fb1d4af7" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802086 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9361ba16-5f40-4c7b-a698-dd74fb1d4af7" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802094 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d23be9-1486-41d9-9f8f-86ca8965b96c" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802100 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d23be9-1486-41d9-9f8f-86ca8965b96c" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802121 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38780138-b2f9-49a0-9ed1-90ee6fbb4c11" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802127 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38780138-b2f9-49a0-9ed1-90ee6fbb4c11" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802136 4826 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-httpd" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802142 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-httpd" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802151 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603d736d-cb9b-41e3-b971-82c457932511" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802158 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="603d736d-cb9b-41e3-b971-82c457932511" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802188 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-api" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802193 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-api" Mar 19 19:19:23 crc kubenswrapper[4826]: E0319 19:19:23.802207 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81a7caa7-3837-46ff-9008-bc8784373127" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802213 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="81a7caa7-3837-46ff-9008-bc8784373127" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802399 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="603d736d-cb9b-41e3-b971-82c457932511" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802414 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-api" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 
19:19:23.802423 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a60e04c4-caed-4173-ac2a-4dca3c4ae080" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802438 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" containerName="neutron-httpd" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802445 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38780138-b2f9-49a0-9ed1-90ee6fbb4c11" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802460 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9361ba16-5f40-4c7b-a698-dd74fb1d4af7" containerName="mariadb-account-create-update" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802467 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d23be9-1486-41d9-9f8f-86ca8965b96c" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.802480 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="81a7caa7-3837-46ff-9008-bc8784373127" containerName="mariadb-database-create" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.804057 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.819955 4826 scope.go:117] "RemoveContainer" containerID="83e7c3bfc46a79113e6c663c7ed9ecee31bd393e79e33163c32138d7bb8d0b91" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.841737 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.844833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.844976 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.845197 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7v9w\" (UniqueName: \"kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.845341 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w2k4\" (UniqueName: \"kubernetes.io/projected/56d23be9-1486-41d9-9f8f-86ca8965b96c-kube-api-access-2w2k4\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.845399 
4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d23be9-1486-41d9-9f8f-86ca8965b96c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.867891 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.881241 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dddfc89b6-hxwk2"] Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.947874 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.948031 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7v9w\" (UniqueName: \"kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.948096 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.948497 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities\") pod \"redhat-operators-cml5k\" (UID: 
\"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.948823 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.968435 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7v9w\" (UniqueName: \"kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w\") pod \"redhat-operators-cml5k\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:23 crc kubenswrapper[4826]: I0319 19:19:23.996793 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef1c972-c655-460d-a249-a1f7d4990b2f" path="/var/lib/kubelet/pods/5ef1c972-c655-460d-a249-a1f7d4990b2f/volumes" Mar 19 19:19:24 crc kubenswrapper[4826]: I0319 19:19:24.131896 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:24 crc kubenswrapper[4826]: I0319 19:19:24.672382 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:19:24 crc kubenswrapper[4826]: I0319 19:19:24.784971 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerStarted","Data":"91350502dcb210944af251ae5a18f9020c8e7524f77d556eca5954e0f74c19fa"} Mar 19 19:19:25 crc kubenswrapper[4826]: E0319 19:19:25.044730 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2662b9_3873_4947_9793_e7e1c6611dcb.slice/crio-90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2662b9_3873_4947_9793_e7e1c6611dcb.slice/crio-conmon-90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797220 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerStarted","Data":"45543cc736227e197297846bee69b87f4a305800250fa4a9e6fb37646cfc9aab"} Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797479 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797532 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="sg-core" 
containerID="cri-o://908540e86d64eae77c502214c1c8eb51757c13a3192a1f8fb098abf8fb0dcabd" gracePeriod=30 Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797531 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="proxy-httpd" containerID="cri-o://45543cc736227e197297846bee69b87f4a305800250fa4a9e6fb37646cfc9aab" gracePeriod=30 Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797577 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-notification-agent" containerID="cri-o://789ca983399ae59295135fa5841d92d23fc9e5743f6fec0e299fe4c25b8db75f" gracePeriod=30 Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.797527 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-central-agent" containerID="cri-o://8accb492b569909be45c55e670bdb92bd9bbbf345c576a4ea28dc2a36fd9db7f" gracePeriod=30 Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.800812 4826 generic.go:334] "Generic (PLEG): container finished" podID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerID="90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc" exitCode=0 Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.800870 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerDied","Data":"90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc"} Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.826314 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.458278781 podStartE2EDuration="13.826295773s" 
podCreationTimestamp="2026-03-19 19:19:12 +0000 UTC" firstStartedPulling="2026-03-19 19:19:20.209624752 +0000 UTC m=+1384.963693065" lastFinishedPulling="2026-03-19 19:19:24.577641734 +0000 UTC m=+1389.331710057" observedRunningTime="2026-03-19 19:19:25.821761883 +0000 UTC m=+1390.575830206" watchObservedRunningTime="2026-03-19 19:19:25.826295773 +0000 UTC m=+1390.580364096" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.865232 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.866913 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.876870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.877115 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.877335 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-vhwpp" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.901850 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.993312 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.993362 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.993436 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjqb\" (UniqueName: \"kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:25 crc kubenswrapper[4826]: I0319 19:19:25.993469 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.010727 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.012400 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.045339 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100417 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100462 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100504 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100526 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100547 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100581 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjqb\" (UniqueName: \"kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100597 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100625 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100673 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.100692 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vlflq\" (UniqueName: \"kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.111386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.117670 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.130290 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjqb\" (UniqueName: \"kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.132035 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data\") pod \"heat-engine-79595bb545-xcckb\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.132081 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:26 crc kubenswrapper[4826]: 
I0319 19:19:26.141645 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.153033 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.177784 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.179238 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.180736 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.197756 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203266 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203302 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203325 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config\") pod 
\"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203365 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203384 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203400 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvbm\" (UniqueName: \"kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203417 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203461 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203478 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlflq\" (UniqueName: \"kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.203509 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.204281 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.204803 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.205391 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.205403 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.205568 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.206751 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.219481 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.230353 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlflq\" (UniqueName: \"kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq\") pod \"dnsmasq-dns-688b9f5b49-sspxd\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305232 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom\") pod 
\"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305279 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcvbm\" (UniqueName: \"kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305297 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305335 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305358 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305411 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle\") pod 
\"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305492 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.305516 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7lzf\" (UniqueName: \"kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.310908 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.315396 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.325828 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: 
\"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.328863 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvbm\" (UniqueName: \"kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm\") pod \"heat-cfnapi-68bcd6d974-ppz6b\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.334536 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.407645 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.407893 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7lzf\" (UniqueName: \"kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.408132 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.408156 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.417674 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.417866 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.417898 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.425121 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7lzf\" (UniqueName: \"kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf\") pod \"heat-api-7c88fd79fb-wz79s\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.430684 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.517025 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.812441 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848159 4826 generic.go:334] "Generic (PLEG): container finished" podID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerID="45543cc736227e197297846bee69b87f4a305800250fa4a9e6fb37646cfc9aab" exitCode=0 Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848188 4826 generic.go:334] "Generic (PLEG): container finished" podID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerID="908540e86d64eae77c502214c1c8eb51757c13a3192a1f8fb098abf8fb0dcabd" exitCode=2 Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848195 4826 generic.go:334] "Generic (PLEG): container finished" podID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerID="789ca983399ae59295135fa5841d92d23fc9e5743f6fec0e299fe4c25b8db75f" exitCode=0 Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848202 4826 generic.go:334] "Generic (PLEG): container finished" podID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerID="8accb492b569909be45c55e670bdb92bd9bbbf345c576a4ea28dc2a36fd9db7f" exitCode=0 Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848243 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerDied","Data":"45543cc736227e197297846bee69b87f4a305800250fa4a9e6fb37646cfc9aab"} Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848295 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerDied","Data":"908540e86d64eae77c502214c1c8eb51757c13a3192a1f8fb098abf8fb0dcabd"} Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848307 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerDied","Data":"789ca983399ae59295135fa5841d92d23fc9e5743f6fec0e299fe4c25b8db75f"} Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.848316 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerDied","Data":"8accb492b569909be45c55e670bdb92bd9bbbf345c576a4ea28dc2a36fd9db7f"} Mar 19 19:19:26 crc kubenswrapper[4826]: W0319 19:19:26.862837 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3611e1_4719_476c_8d8b_ceccedcb14bc.slice/crio-1ce291fa5ab6a3a2bd586ef9a9f2d801ada93141230d7c7668c16e9eaa84c838 WatchSource:0}: Error finding container 1ce291fa5ab6a3a2bd586ef9a9f2d801ada93141230d7c7668c16e9eaa84c838: Status 404 returned error can't find the container with id 1ce291fa5ab6a3a2bd586ef9a9f2d801ada93141230d7c7668c16e9eaa84c838 Mar 19 19:19:26 crc kubenswrapper[4826]: I0319 19:19:26.940302 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.205941 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.359440 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:27 crc kubenswrapper[4826]: W0319 19:19:27.361922 4826 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d932d9e_4ab6_4dcc_bff6_e20ebb18e3d3.slice/crio-aeb78d450d3c8f0381188672144821a342610df376df52ccbc806e6140937586 WatchSource:0}: Error finding container aeb78d450d3c8f0381188672144821a342610df376df52ccbc806e6140937586: Status 404 returned error can't find the container with id aeb78d450d3c8f0381188672144821a342610df376df52ccbc806e6140937586 Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.423950 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.553687 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv6nk\" (UniqueName: \"kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.553947 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.554015 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.554073 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 
19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.554188 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.554317 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.554343 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd\") pod \"5859facc-e308-4701-9de1-ef9f4866a8d3\" (UID: \"5859facc-e308-4701-9de1-ef9f4866a8d3\") " Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.555423 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.555457 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.560325 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk" (OuterVolumeSpecName: "kube-api-access-kv6nk") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "kube-api-access-kv6nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.569757 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts" (OuterVolumeSpecName: "scripts") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.616299 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.656915 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.656941 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv6nk\" (UniqueName: \"kubernetes.io/projected/5859facc-e308-4701-9de1-ef9f4866a8d3-kube-api-access-kv6nk\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.656954 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.656962 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5859facc-e308-4701-9de1-ef9f4866a8d3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.656970 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.691238 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.752288 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data" (OuterVolumeSpecName: "config-data") pod "5859facc-e308-4701-9de1-ef9f4866a8d3" (UID: "5859facc-e308-4701-9de1-ef9f4866a8d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.758740 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.758765 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5859facc-e308-4701-9de1-ef9f4866a8d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.871535 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerStarted","Data":"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.875544 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79595bb545-xcckb" event={"ID":"6a3611e1-4719-476c-8d8b-ceccedcb14bc","Type":"ContainerStarted","Data":"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.875630 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79595bb545-xcckb" event={"ID":"6a3611e1-4719-476c-8d8b-ceccedcb14bc","Type":"ContainerStarted","Data":"1ce291fa5ab6a3a2bd586ef9a9f2d801ada93141230d7c7668c16e9eaa84c838"} Mar 19 19:19:27 crc 
kubenswrapper[4826]: I0319 19:19:27.876854 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.877598 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" event={"ID":"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8","Type":"ContainerStarted","Data":"7723d0844e43483d7f9193a53c58c24afc08cc3ee9e8d533b9e378e8200b6135"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.879336 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c88fd79fb-wz79s" event={"ID":"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3","Type":"ContainerStarted","Data":"aeb78d450d3c8f0381188672144821a342610df376df52ccbc806e6140937586"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.881046 4826 generic.go:334] "Generic (PLEG): container finished" podID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerID="78fcfab2b14088f7eccbcaa73d544a229ad6d4384e2efc2c80135a6bef514d16" exitCode=0 Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.881122 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" event={"ID":"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc","Type":"ContainerDied","Data":"78fcfab2b14088f7eccbcaa73d544a229ad6d4384e2efc2c80135a6bef514d16"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.881156 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" event={"ID":"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc","Type":"ContainerStarted","Data":"3160f3b77bf7ac243fdd1c5ab2cc5953a965a3d7bc54a238ae25234fc78ed0cf"} Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.888512 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5859facc-e308-4701-9de1-ef9f4866a8d3","Type":"ContainerDied","Data":"b15a64e3bc4b774c8ec8909d1d3f3c4a6f794ebc83684f4b29ccbfc181463168"} Mar 19 19:19:27 crc 
kubenswrapper[4826]: I0319 19:19:27.888559 4826 scope.go:117] "RemoveContainer" containerID="45543cc736227e197297846bee69b87f4a305800250fa4a9e6fb37646cfc9aab" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.888731 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:27 crc kubenswrapper[4826]: I0319 19:19:27.987590 4826 scope.go:117] "RemoveContainer" containerID="908540e86d64eae77c502214c1c8eb51757c13a3192a1f8fb098abf8fb0dcabd" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.000481 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79595bb545-xcckb" podStartSLOduration=3.00046305 podStartE2EDuration="3.00046305s" podCreationTimestamp="2026-03-19 19:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:27.940099391 +0000 UTC m=+1392.694167694" watchObservedRunningTime="2026-03-19 19:19:28.00046305 +0000 UTC m=+1392.754531363" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.055866 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.074814 4826 scope.go:117] "RemoveContainer" containerID="789ca983399ae59295135fa5841d92d23fc9e5743f6fec0e299fe4c25b8db75f" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.103101 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.121717 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:28 crc kubenswrapper[4826]: E0319 19:19:28.122416 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="proxy-httpd" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122434 4826 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="proxy-httpd" Mar 19 19:19:28 crc kubenswrapper[4826]: E0319 19:19:28.122455 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-central-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122461 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-central-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: E0319 19:19:28.122468 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="sg-core" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122473 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="sg-core" Mar 19 19:19:28 crc kubenswrapper[4826]: E0319 19:19:28.122503 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-notification-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122509 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-notification-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122778 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-notification-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122802 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="sg-core" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122822 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="proxy-httpd" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.122834 4826 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" containerName="ceilometer-central-agent" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.124912 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.129353 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.130046 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.136836 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174121 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174234 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174266 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174434 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174544 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174796 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pmk\" (UniqueName: \"kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.174850 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.227625 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nn4pv"] Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.229933 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.233888 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8p84g" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.233997 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.234079 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.244684 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nn4pv"] Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.268104 4826 scope.go:117] "RemoveContainer" containerID="8accb492b569909be45c55e670bdb92bd9bbbf345c576a4ea28dc2a36fd9db7f" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282638 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282703 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282765 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: 
\"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282822 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pmk\" (UniqueName: \"kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282849 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282879 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282927 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.282986 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc 
kubenswrapper[4826]: I0319 19:19:28.283034 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2mww\" (UniqueName: \"kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.290872 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.291000 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.291401 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.292821 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.293003 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.296028 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.298021 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.300262 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.306485 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pmk\" (UniqueName: \"kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk\") pod \"ceilometer-0\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.394113 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc 
kubenswrapper[4826]: I0319 19:19:28.394248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.394277 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.394367 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2mww\" (UniqueName: \"kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.399799 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.404578 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 
19:19:28.412876 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.417344 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2mww\" (UniqueName: \"kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww\") pod \"nova-cell0-conductor-db-sync-nn4pv\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.470798 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.580121 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.936004 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" event={"ID":"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc","Type":"ContainerStarted","Data":"1286bc0ae1e60c4fd1bca205079bc99eb86cd33504aed4a0de1f5eeec643da4a"} Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.937782 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:28 crc kubenswrapper[4826]: I0319 19:19:28.966793 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" podStartSLOduration=3.966772016 podStartE2EDuration="3.966772016s" podCreationTimestamp="2026-03-19 19:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:28.958419124 +0000 UTC m=+1393.712487457" watchObservedRunningTime="2026-03-19 19:19:28.966772016 +0000 UTC m=+1393.720840329" Mar 19 19:19:29 crc kubenswrapper[4826]: W0319 19:19:29.046708 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf69e3ed0_cf8e_438d_a6f0_dac56664901e.slice/crio-c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca WatchSource:0}: Error finding container c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca: Status 404 returned error can't find the container with id c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.047621 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nn4pv"] Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.206366 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:29 crc 
kubenswrapper[4826]: I0319 19:19:29.497306 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.497568 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-log" containerID="cri-o://f1b048f794c806a00b659dc40b9462845f9f298f42c762391b8b6db5f6b14904" gracePeriod=30 Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.497941 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-httpd" containerID="cri-o://7f01b1f044c2e1724a50234bed289541d76936595212ea22858965ec18c6384c" gracePeriod=30 Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.959699 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" event={"ID":"f69e3ed0-cf8e-438d-a6f0-dac56664901e","Type":"ContainerStarted","Data":"c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca"} Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.962717 4826 generic.go:334] "Generic (PLEG): container finished" podID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerID="f1b048f794c806a00b659dc40b9462845f9f298f42c762391b8b6db5f6b14904" exitCode=143 Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.962834 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerDied","Data":"f1b048f794c806a00b659dc40b9462845f9f298f42c762391b8b6db5f6b14904"} Mar 19 19:19:29 crc kubenswrapper[4826]: I0319 19:19:29.991226 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5859facc-e308-4701-9de1-ef9f4866a8d3" 
path="/var/lib/kubelet/pods/5859facc-e308-4701-9de1-ef9f4866a8d3/volumes" Mar 19 19:19:30 crc kubenswrapper[4826]: I0319 19:19:30.630078 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:30 crc kubenswrapper[4826]: W0319 19:19:30.657274 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode59b525d_52f1_4d86_a5da_37c081313417.slice/crio-250d1310bdb99ec08f400300424c500dd0c796dfb4f57da8efa297f9539dd7f6 WatchSource:0}: Error finding container 250d1310bdb99ec08f400300424c500dd0c796dfb4f57da8efa297f9539dd7f6: Status 404 returned error can't find the container with id 250d1310bdb99ec08f400300424c500dd0c796dfb4f57da8efa297f9539dd7f6 Mar 19 19:19:30 crc kubenswrapper[4826]: I0319 19:19:30.994739 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerStarted","Data":"250d1310bdb99ec08f400300424c500dd0c796dfb4f57da8efa297f9539dd7f6"} Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.014012 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" event={"ID":"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8","Type":"ContainerStarted","Data":"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8"} Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.014772 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.018028 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c88fd79fb-wz79s" event={"ID":"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3","Type":"ContainerStarted","Data":"95c3c6e502efc12c74623e5694c7a61ea772e53865b7e0738e3661da68279953"} Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.018194 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.044137 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" podStartSLOduration=2.514572548 podStartE2EDuration="6.044112703s" podCreationTimestamp="2026-03-19 19:19:26 +0000 UTC" firstStartedPulling="2026-03-19 19:19:27.230496652 +0000 UTC m=+1391.984564965" lastFinishedPulling="2026-03-19 19:19:30.760036807 +0000 UTC m=+1395.514105120" observedRunningTime="2026-03-19 19:19:32.030923934 +0000 UTC m=+1396.784992237" watchObservedRunningTime="2026-03-19 19:19:32.044112703 +0000 UTC m=+1396.798181026" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.052008 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7c88fd79fb-wz79s" podStartSLOduration=2.651543347 podStartE2EDuration="6.051993073s" podCreationTimestamp="2026-03-19 19:19:26 +0000 UTC" firstStartedPulling="2026-03-19 19:19:27.369262095 +0000 UTC m=+1392.123330408" lastFinishedPulling="2026-03-19 19:19:30.769711831 +0000 UTC m=+1395.523780134" observedRunningTime="2026-03-19 19:19:32.045215869 +0000 UTC m=+1396.799284172" watchObservedRunningTime="2026-03-19 19:19:32.051993073 +0000 UTC m=+1396.806061386" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.712604 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.749270 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.752327 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.752415 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.752605 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9fn\" (UniqueName: \"kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.752721 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.831817 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.833292 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861591 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pj5\" (UniqueName: \"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861646 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861688 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861801 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom\") pod \"heat-api-d8697dd89-f2jgx\" (UID: 
\"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861839 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861872 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.861906 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9fn\" (UniqueName: \"kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.871080 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.889208 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.897528 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9fn\" 
(UniqueName: \"kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.912483 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.929806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle\") pod \"heat-engine-5778865fb9-z27ps\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.936463 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.937916 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.957720 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.967898 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.967948 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968000 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968051 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968109 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968180 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pj5\" (UniqueName: \"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968246 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jpr\" (UniqueName: \"kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.968284 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.977510 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.988742 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.993708 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:32 crc kubenswrapper[4826]: I0319 19:19:32.999334 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pj5\" (UniqueName: \"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5\") pod \"heat-api-d8697dd89-f2jgx\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.006255 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.037219 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerStarted","Data":"550a8f683c972bd4e72cdadb62c99812697396be7235c67cb0bf0cd2f12f8d88"} Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.073482 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.073787 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jpr\" 
(UniqueName: \"kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.073878 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.074011 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.076093 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.089161 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.089428 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.094961 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.098045 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jpr\" (UniqueName: \"kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr\") pod \"heat-cfnapi-6bc67f648c-7ktnt\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.149704 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.170974 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.781884 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.952693 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:33 crc kubenswrapper[4826]: I0319 19:19:33.972200 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:33 crc kubenswrapper[4826]: W0319 19:19:33.991803 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfba7304a_508e_40ce_899c_608fd790ee26.slice/crio-cbcbf27e3ef482164887bde4ba3f7c94f36488fac325c72a365d0e2a6952bd26 WatchSource:0}: Error finding container cbcbf27e3ef482164887bde4ba3f7c94f36488fac325c72a365d0e2a6952bd26: Status 404 returned error can't find the container with id cbcbf27e3ef482164887bde4ba3f7c94f36488fac325c72a365d0e2a6952bd26 Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.063222 4826 generic.go:334] "Generic (PLEG): container finished" podID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerID="7f01b1f044c2e1724a50234bed289541d76936595212ea22858965ec18c6384c" exitCode=0 Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.063313 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerDied","Data":"7f01b1f044c2e1724a50234bed289541d76936595212ea22858965ec18c6384c"} Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.074126 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerStarted","Data":"ea9f38be91b0a71b13926d7222d2bf66e5a8cd150a918024642d064f04a772e1"} Mar 19 19:19:34 crc 
kubenswrapper[4826]: I0319 19:19:34.080505 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5778865fb9-z27ps" event={"ID":"76db7194-58de-4efa-8ffc-a18f17d2a3c4","Type":"ContainerStarted","Data":"69740d7ca7b86c059bf74495223cd2f5a4d9d1fd8f209241cce4976d07d9c841"} Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.081933 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" event={"ID":"6f72991a-39f2-4e20-bbba-68e7761f0644","Type":"ContainerStarted","Data":"71a288ef9674cb737997f93b27bdd7d3d476e228a4ba8a52ad37b0e60ca54555"} Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.088101 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d8697dd89-f2jgx" event={"ID":"fba7304a-508e-40ce-899c-608fd790ee26","Type":"ContainerStarted","Data":"cbcbf27e3ef482164887bde4ba3f7c94f36488fac325c72a365d0e2a6952bd26"} Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.306982 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420074 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420146 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420242 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420298 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj2jl\" (UniqueName: \"kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420320 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420458 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.420496 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.421955 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.422712 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs" (OuterVolumeSpecName: "logs") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.423552 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"2ee13755-098f-4b30-8f68-4376adb9d4aa\" (UID: \"2ee13755-098f-4b30-8f68-4376adb9d4aa\") " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.443382 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.443413 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ee13755-098f-4b30-8f68-4376adb9d4aa-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.453498 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl" (OuterVolumeSpecName: "kube-api-access-cj2jl") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "kube-api-access-cj2jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.463509 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts" (OuterVolumeSpecName: "scripts") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.547104 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj2jl\" (UniqueName: \"kubernetes.io/projected/2ee13755-098f-4b30-8f68-4376adb9d4aa-kube-api-access-cj2jl\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.547528 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.584927 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e" (OuterVolumeSpecName: "glance") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.649739 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") on node \"crc\" " Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.667924 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.725901 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.726066 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e") on node "crc" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.741184 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.752126 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.752155 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.752168 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.763390 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data" (OuterVolumeSpecName: "config-data") pod "2ee13755-098f-4b30-8f68-4376adb9d4aa" (UID: "2ee13755-098f-4b30-8f68-4376adb9d4aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.870068 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ee13755-098f-4b30-8f68-4376adb9d4aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:34 crc kubenswrapper[4826]: I0319 19:19:34.920819 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="e18e7f7e-f1f1-4349-a076-79e1f781315d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.224:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.019788 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.020243 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7c88fd79fb-wz79s" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerName="heat-api" containerID="cri-o://95c3c6e502efc12c74623e5694c7a61ea772e53865b7e0738e3661da68279953" gracePeriod=60 Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.041419 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.043299 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" podUID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" containerName="heat-cfnapi" containerID="cri-o://78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8" gracePeriod=60 Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.053931 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:19:35 crc kubenswrapper[4826]: E0319 19:19:35.054403 4826 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-httpd" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.054419 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-httpd" Mar 19 19:19:35 crc kubenswrapper[4826]: E0319 19:19:35.054431 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-log" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.054437 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-log" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.054682 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-httpd" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.054695 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" containerName="glance-log" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.099639 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.104544 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.109335 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.111780 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.121026 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.125807 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.131581 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.152704 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.189220 4826 generic.go:334] "Generic (PLEG): container finished" podID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerID="863f1b6cd22febb26830d51a6e705d2f15751aff4255356a0e9d1d8240a177dd" exitCode=1 Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.190390 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" event={"ID":"6f72991a-39f2-4e20-bbba-68e7761f0644","Type":"ContainerDied","Data":"863f1b6cd22febb26830d51a6e705d2f15751aff4255356a0e9d1d8240a177dd"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.191103 4826 scope.go:117] "RemoveContainer" containerID="863f1b6cd22febb26830d51a6e705d2f15751aff4255356a0e9d1d8240a177dd" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.197133 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.213561 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.213732 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.213762 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.213794 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27bh\" (UniqueName: \"kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.214835 4826 generic.go:334] "Generic (PLEG): container finished" podID="fba7304a-508e-40ce-899c-608fd790ee26" containerID="c7a3845637e2063d13b7e2eeb9a9def2641884332306e2b18e677e34ffc6f433" exitCode=1 Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.214919 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d8697dd89-f2jgx" event={"ID":"fba7304a-508e-40ce-899c-608fd790ee26","Type":"ContainerDied","Data":"c7a3845637e2063d13b7e2eeb9a9def2641884332306e2b18e677e34ffc6f433"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.213828 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle\") pod 
\"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215560 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215606 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215636 4826 scope.go:117] "RemoveContainer" containerID="c7a3845637e2063d13b7e2eeb9a9def2641884332306e2b18e677e34ffc6f433" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215699 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215725 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 
19:19:35.215759 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215886 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lljph\" (UniqueName: \"kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.215953 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.239743 4826 generic.go:334] "Generic (PLEG): container finished" podID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerID="3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef" exitCode=0 Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.239799 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerDied","Data":"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.289380 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2ee13755-098f-4b30-8f68-4376adb9d4aa","Type":"ContainerDied","Data":"c70ffe91fc6694f141565003afb38ef310c5fda8d8043c89fa5b94ee5e7a2c0b"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.289433 4826 scope.go:117] "RemoveContainer" containerID="7f01b1f044c2e1724a50234bed289541d76936595212ea22858965ec18c6384c" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.289587 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.317798 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lljph\" (UniqueName: \"kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.317852 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.317920 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318021 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") pod \"heat-api-784d5749b4-gl4q7\" 
(UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318040 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318057 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27bh\" (UniqueName: \"kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318088 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318103 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318134 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " 
pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318191 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318214 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.318240 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.330352 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerStarted","Data":"e4c9622a827f99ee1ce26e67a3f0278a1160184f8d6ad03c38537f6f5cff54c1"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.342266 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.364578 4826 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.366521 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.367074 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27bh\" (UniqueName: \"kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.372447 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.381887 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.385255 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.385732 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs\") pod \"heat-api-784d5749b4-gl4q7\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.386291 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lljph\" (UniqueName: \"kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.393551 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.396862 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.403390 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5778865fb9-z27ps" 
event={"ID":"76db7194-58de-4efa-8ffc-a18f17d2a3c4","Type":"ContainerStarted","Data":"68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee"} Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.406161 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.414509 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data\") pod \"heat-cfnapi-559768d959-n8n6w\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.446976 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5778865fb9-z27ps" podStartSLOduration=3.446956047 podStartE2EDuration="3.446956047s" podCreationTimestamp="2026-03-19 19:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:35.438747909 +0000 UTC m=+1400.192816232" watchObservedRunningTime="2026-03-19 19:19:35.446956047 +0000 UTC m=+1400.201024360" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.450411 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.488124 4826 scope.go:117] "RemoveContainer" containerID="f1b048f794c806a00b659dc40b9462845f9f298f42c762391b8b6db5f6b14904" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.489966 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.684111 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.703344 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.733839 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.735808 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.741614 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.741856 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.750267 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850451 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-logs\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850743 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod 
\"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850786 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850813 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nq6r\" (UniqueName: \"kubernetes.io/projected/3ada0e1e-2ace-4520-9fce-e0d9771afc01-kube-api-access-4nq6r\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850851 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850936 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.850996 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.851017 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954207 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-logs\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954263 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954338 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nq6r\" (UniqueName: 
\"kubernetes.io/projected/3ada0e1e-2ace-4520-9fce-e0d9771afc01-kube-api-access-4nq6r\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954372 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954480 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954549 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.954573 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.955377 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.958939 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ada0e1e-2ace-4520-9fce-e0d9771afc01-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.961398 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.961503 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a5c6a8766a786cbd3eca35b0f5a1e3802ae1e4cbe235d9479277134f5caec0c/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.966348 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="e18e7f7e-f1f1-4349-a076-79e1f781315d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.224:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.970057 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.970341 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.978578 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.982210 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ada0e1e-2ace-4520-9fce-e0d9771afc01-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:35 crc kubenswrapper[4826]: I0319 19:19:35.997966 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nq6r\" (UniqueName: \"kubernetes.io/projected/3ada0e1e-2ace-4520-9fce-e0d9771afc01-kube-api-access-4nq6r\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.047489 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee13755-098f-4b30-8f68-4376adb9d4aa" path="/var/lib/kubelet/pods/2ee13755-098f-4b30-8f68-4376adb9d4aa/volumes" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.065131 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce7ce0de-edb0-4ca1-b23b-a1184d7ab53e\") pod \"glance-default-internal-api-0\" (UID: \"3ada0e1e-2ace-4520-9fce-e0d9771afc01\") " pod="openstack/glance-default-internal-api-0" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.182179 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.266741 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcvbm\" (UniqueName: \"kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm\") pod \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.267305 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom\") pod \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.267379 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle\") pod \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.267420 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data\") pod \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\" (UID: \"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.272289 4826 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.272729 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm" (OuterVolumeSpecName: "kube-api-access-gcvbm") pod "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" (UID: "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8"). InnerVolumeSpecName "kube-api-access-gcvbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.273393 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcvbm\" (UniqueName: \"kubernetes.io/projected/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-kube-api-access-gcvbm\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.278874 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" (UID: "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.358563 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.359712 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.376816 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.393772 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" (UID: "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.404836 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data" (OuterVolumeSpecName: "config-data") pod "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" (UID: "82d5c8e9-6fae-4abc-996d-f1d3ce618ac8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.463334 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.475367 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="dnsmasq-dns" containerID="cri-o://44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b" gracePeriod=10 Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.479951 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.484230 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.481369 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" event={"ID":"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8","Type":"ContainerDied","Data":"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.484419 4826 scope.go:117] "RemoveContainer" containerID="78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.481345 4826 generic.go:334] "Generic (PLEG): container finished" podID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" containerID="78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8" exitCode=0 Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.481431 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.484792 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bcd6d974-ppz6b" event={"ID":"82d5c8e9-6fae-4abc-996d-f1d3ce618ac8","Type":"ContainerDied","Data":"7723d0844e43483d7f9193a53c58c24afc08cc3ee9e8d533b9e378e8200b6135"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.511141 4826 generic.go:334] "Generic (PLEG): container finished" podID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerID="95c3c6e502efc12c74623e5694c7a61ea772e53865b7e0738e3661da68279953" exitCode=0 Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.511212 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c88fd79fb-wz79s" event={"ID":"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3","Type":"ContainerDied","Data":"95c3c6e502efc12c74623e5694c7a61ea772e53865b7e0738e3661da68279953"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.517551 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.527298 4826 generic.go:334] "Generic (PLEG): container finished" podID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" exitCode=1 Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.527537 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" event={"ID":"6f72991a-39f2-4e20-bbba-68e7761f0644","Type":"ContainerDied","Data":"18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.528236 4826 scope.go:117] "RemoveContainer" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" Mar 19 19:19:36 crc kubenswrapper[4826]: E0319 19:19:36.528526 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6bc67f648c-7ktnt_openstack(6f72991a-39f2-4e20-bbba-68e7761f0644)\"" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.542797 4826 generic.go:334] "Generic (PLEG): container finished" podID="fba7304a-508e-40ce-899c-608fd790ee26" containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" exitCode=1 Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.542846 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d8697dd89-f2jgx" event={"ID":"fba7304a-508e-40ce-899c-608fd790ee26","Type":"ContainerDied","Data":"7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.543242 4826 scope.go:117] "RemoveContainer" containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" Mar 19 19:19:36 crc kubenswrapper[4826]: E0319 19:19:36.543442 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d8697dd89-f2jgx_openstack(fba7304a-508e-40ce-899c-608fd790ee26)\"" pod="openstack/heat-api-d8697dd89-f2jgx" podUID="fba7304a-508e-40ce-899c-608fd790ee26" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.561461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerStarted","Data":"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.572634 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-784d5749b4-gl4q7" 
event={"ID":"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac","Type":"ContainerStarted","Data":"65068b9bcf79aa38cf3d0a9cfb5d3aa5593a175a7b589cd68d69ef048390dcec"} Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.602466 4826 scope.go:117] "RemoveContainer" containerID="78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8" Mar 19 19:19:36 crc kubenswrapper[4826]: E0319 19:19:36.613365 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8\": container with ID starting with 78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8 not found: ID does not exist" containerID="78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.613403 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8"} err="failed to get container status \"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8\": rpc error: code = NotFound desc = could not find container \"78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8\": container with ID starting with 78fa0b8373dfced29b5cee7aa94853b1a82d50aa81e14bf83daca33d54d170c8 not found: ID does not exist" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.613428 4826 scope.go:117] "RemoveContainer" containerID="863f1b6cd22febb26830d51a6e705d2f15751aff4255356a0e9d1d8240a177dd" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.691448 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.704822 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-68bcd6d974-ppz6b"] Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.715677 4826 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cml5k" podStartSLOduration=3.718185059 podStartE2EDuration="13.715644971s" podCreationTimestamp="2026-03-19 19:19:23 +0000 UTC" firstStartedPulling="2026-03-19 19:19:25.806296969 +0000 UTC m=+1390.560365292" lastFinishedPulling="2026-03-19 19:19:35.803756891 +0000 UTC m=+1400.557825204" observedRunningTime="2026-03-19 19:19:36.651664314 +0000 UTC m=+1401.405732627" watchObservedRunningTime="2026-03-19 19:19:36.715644971 +0000 UTC m=+1401.469713284" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.731794 4826 scope.go:117] "RemoveContainer" containerID="c7a3845637e2063d13b7e2eeb9a9def2641884332306e2b18e677e34ffc6f433" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.744608 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.794014 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7lzf\" (UniqueName: \"kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf\") pod \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.794071 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data\") pod \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.794119 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle\") pod \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " Mar 19 19:19:36 crc 
kubenswrapper[4826]: I0319 19:19:36.794302 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom\") pod \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\" (UID: \"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3\") " Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.808858 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf" (OuterVolumeSpecName: "kube-api-access-s7lzf") pod "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" (UID: "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3"). InnerVolumeSpecName "kube-api-access-s7lzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.809591 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" (UID: "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.866817 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" (UID: "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.897165 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.897196 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7lzf\" (UniqueName: \"kubernetes.io/projected/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-kube-api-access-s7lzf\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.897208 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:36 crc kubenswrapper[4826]: I0319 19:19:36.925602 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data" (OuterVolumeSpecName: "config-data") pod "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" (UID: "6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.002915 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: W0319 19:19:37.435356 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ada0e1e_2ace_4520_9fce_e0d9771afc01.slice/crio-a3392f4ac0605d4daefdfeac47fc6cfb045781454c97070d031dbace663d06a6 WatchSource:0}: Error finding container a3392f4ac0605d4daefdfeac47fc6cfb045781454c97070d031dbace663d06a6: Status 404 returned error can't find the container with id a3392f4ac0605d4daefdfeac47fc6cfb045781454c97070d031dbace663d06a6 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.468565 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.590990 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629452 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnqbp\" (UniqueName: \"kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629599 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629629 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629755 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629803 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.629866 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config\") pod \"3fd164df-263a-4817-a6d0-bedc59421e75\" (UID: \"3fd164df-263a-4817-a6d0-bedc59421e75\") " Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.676831 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp" (OuterVolumeSpecName: "kube-api-access-jnqbp") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "kube-api-access-jnqbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.713220 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ada0e1e-2ace-4520-9fce-e0d9771afc01","Type":"ContainerStarted","Data":"a3392f4ac0605d4daefdfeac47fc6cfb045781454c97070d031dbace663d06a6"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.730113 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-784d5749b4-gl4q7" event={"ID":"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac","Type":"ContainerStarted","Data":"5446ca03f39f3c674da1a362b6032b333090d971f781df7dcac73f902e93848a"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.732334 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnqbp\" (UniqueName: \"kubernetes.io/projected/3fd164df-263a-4817-a6d0-bedc59421e75-kube-api-access-jnqbp\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.732379 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.765643 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerStarted","Data":"f25fc23efa997d4174e70f1fe9be5552330d68c23639add1ce865beece04372f"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.765896 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-central-agent" containerID="cri-o://550a8f683c972bd4e72cdadb62c99812697396be7235c67cb0bf0cd2f12f8d88" gracePeriod=30 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.766063 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.766117 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="proxy-httpd" containerID="cri-o://f25fc23efa997d4174e70f1fe9be5552330d68c23639add1ce865beece04372f" gracePeriod=30 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.766157 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="sg-core" containerID="cri-o://e4c9622a827f99ee1ce26e67a3f0278a1160184f8d6ad03c38537f6f5cff54c1" gracePeriod=30 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.766192 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-notification-agent" containerID="cri-o://ea9f38be91b0a71b13926d7222d2bf66e5a8cd150a918024642d064f04a772e1" gracePeriod=30 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.768790 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-784d5749b4-gl4q7" podStartSLOduration=2.768771064 podStartE2EDuration="2.768771064s" podCreationTimestamp="2026-03-19 19:19:35 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:37.76738595 +0000 UTC m=+1402.521454273" watchObservedRunningTime="2026-03-19 19:19:37.768771064 +0000 UTC m=+1402.522839377" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.787016 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.794114 4826 generic.go:334] "Generic (PLEG): container finished" podID="3fd164df-263a-4817-a6d0-bedc59421e75" containerID="44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b" exitCode=0 Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.794201 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.794203 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" event={"ID":"3fd164df-263a-4817-a6d0-bedc59421e75","Type":"ContainerDied","Data":"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.794479 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-z2dgh" event={"ID":"3fd164df-263a-4817-a6d0-bedc59421e75","Type":"ContainerDied","Data":"e0d31fea3d1ad082dee49de6b2b4f17a02d10c94a8dd776e6293dbd61323e4d0"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.794521 4826 scope.go:117] "RemoveContainer" containerID="44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.809042 4826 scope.go:117] "RemoveContainer" containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" Mar 19 19:19:37 crc kubenswrapper[4826]: E0319 19:19:37.809275 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d8697dd89-f2jgx_openstack(fba7304a-508e-40ce-899c-608fd790ee26)\"" pod="openstack/heat-api-d8697dd89-f2jgx" podUID="fba7304a-508e-40ce-899c-608fd790ee26" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.819941 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.96576668 podStartE2EDuration="10.819924991s" podCreationTimestamp="2026-03-19 19:19:27 +0000 UTC" firstStartedPulling="2026-03-19 19:19:30.747629948 +0000 UTC m=+1395.501698261" lastFinishedPulling="2026-03-19 19:19:36.601788259 +0000 UTC m=+1401.355856572" observedRunningTime="2026-03-19 19:19:37.809534039 +0000 UTC m=+1402.563602352" 
watchObservedRunningTime="2026-03-19 19:19:37.819924991 +0000 UTC m=+1402.573993304" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.839283 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.846935 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-559768d959-n8n6w" event={"ID":"63a6bcd4-833d-4f25-a4ab-e890afd8feb1","Type":"ContainerStarted","Data":"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.846980 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-559768d959-n8n6w" event={"ID":"63a6bcd4-833d-4f25-a4ab-e890afd8feb1","Type":"ContainerStarted","Data":"4a1d859f8a14665eb264785e6ff8bfa0d3c8bf459cbb4f1e28fb5a0737d6ee25"} Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.848032 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.862639 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.870743 4826 scope.go:117] "RemoveContainer" containerID="29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.901454 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.928318 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config" (OuterVolumeSpecName: "config") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.969484 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.969514 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.969526 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.978215 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3fd164df-263a-4817-a6d0-bedc59421e75" (UID: "3fd164df-263a-4817-a6d0-bedc59421e75"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:19:37 crc kubenswrapper[4826]: I0319 19:19:37.990704 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7c88fd79fb-wz79s" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.004985 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-559768d959-n8n6w" podStartSLOduration=3.004964383 podStartE2EDuration="3.004964383s" podCreationTimestamp="2026-03-19 19:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:37.874677874 +0000 UTC m=+1402.628746187" watchObservedRunningTime="2026-03-19 19:19:38.004964383 +0000 UTC m=+1402.759032716" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.038209 4826 scope.go:117] "RemoveContainer" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.083222 4826 scope.go:117] "RemoveContainer" containerID="44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b" Mar 19 19:19:38 crc kubenswrapper[4826]: E0319 19:19:38.084841 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6bc67f648c-7ktnt_openstack(6f72991a-39f2-4e20-bbba-68e7761f0644)\"" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.085893 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" path="/var/lib/kubelet/pods/82d5c8e9-6fae-4abc-996d-f1d3ce618ac8/volumes" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.086984 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7c88fd79fb-wz79s" event={"ID":"6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3","Type":"ContainerDied","Data":"aeb78d450d3c8f0381188672144821a342610df376df52ccbc806e6140937586"} Mar 19 19:19:38 crc 
kubenswrapper[4826]: I0319 19:19:38.089466 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fd164df-263a-4817-a6d0-bedc59421e75-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:38 crc kubenswrapper[4826]: E0319 19:19:38.093326 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b\": container with ID starting with 44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b not found: ID does not exist" containerID="44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.093369 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b"} err="failed to get container status \"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b\": rpc error: code = NotFound desc = could not find container \"44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b\": container with ID starting with 44face072e2b1e82a23f7e97a749880d84d47c51f0ca5fd80a4f066805463c2b not found: ID does not exist" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.093394 4826 scope.go:117] "RemoveContainer" containerID="29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5" Mar 19 19:19:38 crc kubenswrapper[4826]: E0319 19:19:38.107275 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5\": container with ID starting with 29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5 not found: ID does not exist" containerID="29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.107340 
4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5"} err="failed to get container status \"29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5\": rpc error: code = NotFound desc = could not find container \"29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5\": container with ID starting with 29f5a30b6591c6bd2961572881ae6928c29655bd200d0f08ec2bfd5716a2bdf5 not found: ID does not exist" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.107370 4826 scope.go:117] "RemoveContainer" containerID="95c3c6e502efc12c74623e5694c7a61ea772e53865b7e0738e3661da68279953" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.150479 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.150582 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.172267 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.172753 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.193498 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.206134 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7c88fd79fb-wz79s"] Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.215575 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:19:38 crc kubenswrapper[4826]: I0319 19:19:38.225139 4826 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-z2dgh"] Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.056239 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ada0e1e-2ace-4520-9fce-e0d9771afc01","Type":"ContainerStarted","Data":"a5f57f4cf0a0ecf56ee8b1e8c945ca35eb96d6871a5776ee42f32d608d7cea61"} Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.059884 4826 generic.go:334] "Generic (PLEG): container finished" podID="e59b525d-52f1-4d86-a5da-37c081313417" containerID="f25fc23efa997d4174e70f1fe9be5552330d68c23639add1ce865beece04372f" exitCode=0 Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.060035 4826 generic.go:334] "Generic (PLEG): container finished" podID="e59b525d-52f1-4d86-a5da-37c081313417" containerID="e4c9622a827f99ee1ce26e67a3f0278a1160184f8d6ad03c38537f6f5cff54c1" exitCode=2 Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.060101 4826 generic.go:334] "Generic (PLEG): container finished" podID="e59b525d-52f1-4d86-a5da-37c081313417" containerID="ea9f38be91b0a71b13926d7222d2bf66e5a8cd150a918024642d064f04a772e1" exitCode=0 Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.059964 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerDied","Data":"f25fc23efa997d4174e70f1fe9be5552330d68c23639add1ce865beece04372f"} Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.060311 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerDied","Data":"e4c9622a827f99ee1ce26e67a3f0278a1160184f8d6ad03c38537f6f5cff54c1"} Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.060326 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerDied","Data":"ea9f38be91b0a71b13926d7222d2bf66e5a8cd150a918024642d064f04a772e1"} Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.061483 4826 scope.go:117] "RemoveContainer" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" Mar 19 19:19:39 crc kubenswrapper[4826]: E0319 19:19:39.061748 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6bc67f648c-7ktnt_openstack(6f72991a-39f2-4e20-bbba-68e7761f0644)\"" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.061967 4826 scope.go:117] "RemoveContainer" containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" Mar 19 19:19:39 crc kubenswrapper[4826]: E0319 19:19:39.062245 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d8697dd89-f2jgx_openstack(fba7304a-508e-40ce-899c-608fd790ee26)\"" pod="openstack/heat-api-d8697dd89-f2jgx" podUID="fba7304a-508e-40ce-899c-608fd790ee26" Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.206907 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.411734 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.412913 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-log" 
containerID="cri-o://3b7f96ca48f5f84c67af82ba8d22e31bfc942a1b6150444356ba090ccd0a15de" gracePeriod=30 Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.413546 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-httpd" containerID="cri-o://a0ea25b10134f259221bf3f106ba88d381ec21f9146a923476d1ce60455d1651" gracePeriod=30 Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.992441 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" path="/var/lib/kubelet/pods/3fd164df-263a-4817-a6d0-bedc59421e75/volumes" Mar 19 19:19:39 crc kubenswrapper[4826]: I0319 19:19:39.993732 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" path="/var/lib/kubelet/pods/6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3/volumes" Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.076159 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ada0e1e-2ace-4520-9fce-e0d9771afc01","Type":"ContainerStarted","Data":"f8e88e072bc10a38a8518573248b772cd130cde2824296ff418a99b691edbd65"} Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.078440 4826 generic.go:334] "Generic (PLEG): container finished" podID="8b2afd63-9250-4f56-be2a-271b75704b95" containerID="3b7f96ca48f5f84c67af82ba8d22e31bfc942a1b6150444356ba090ccd0a15de" exitCode=143 Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.078540 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerDied","Data":"3b7f96ca48f5f84c67af82ba8d22e31bfc942a1b6150444356ba090ccd0a15de"} Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.080971 4826 scope.go:117] "RemoveContainer" 
containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" Mar 19 19:19:40 crc kubenswrapper[4826]: E0319 19:19:40.081285 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d8697dd89-f2jgx_openstack(fba7304a-508e-40ce-899c-608fd790ee26)\"" pod="openstack/heat-api-d8697dd89-f2jgx" podUID="fba7304a-508e-40ce-899c-608fd790ee26" Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.081578 4826 scope.go:117] "RemoveContainer" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" Mar 19 19:19:40 crc kubenswrapper[4826]: E0319 19:19:40.082062 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6bc67f648c-7ktnt_openstack(6f72991a-39f2-4e20-bbba-68e7761f0644)\"" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" Mar 19 19:19:40 crc kubenswrapper[4826]: I0319 19:19:40.104136 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.104114948 podStartE2EDuration="5.104114948s" podCreationTimestamp="2026-03-19 19:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:40.100219853 +0000 UTC m=+1404.854288166" watchObservedRunningTime="2026-03-19 19:19:40.104114948 +0000 UTC m=+1404.858183261" Mar 19 19:19:41 crc kubenswrapper[4826]: I0319 19:19:41.100604 4826 generic.go:334] "Generic (PLEG): container finished" podID="e59b525d-52f1-4d86-a5da-37c081313417" containerID="550a8f683c972bd4e72cdadb62c99812697396be7235c67cb0bf0cd2f12f8d88" exitCode=0 Mar 19 19:19:41 crc kubenswrapper[4826]: I0319 19:19:41.100743 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerDied","Data":"550a8f683c972bd4e72cdadb62c99812697396be7235c67cb0bf0cd2f12f8d88"} Mar 19 19:19:43 crc kubenswrapper[4826]: I0319 19:19:43.138408 4826 generic.go:334] "Generic (PLEG): container finished" podID="8b2afd63-9250-4f56-be2a-271b75704b95" containerID="a0ea25b10134f259221bf3f106ba88d381ec21f9146a923476d1ce60455d1651" exitCode=0 Mar 19 19:19:43 crc kubenswrapper[4826]: I0319 19:19:43.138977 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerDied","Data":"a0ea25b10134f259221bf3f106ba88d381ec21f9146a923476d1ce60455d1651"} Mar 19 19:19:44 crc kubenswrapper[4826]: I0319 19:19:44.132273 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:44 crc kubenswrapper[4826]: I0319 19:19:44.132404 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.193574 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:19:45 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:19:45 crc kubenswrapper[4826]: > Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.534446 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.586260 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.700881 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.700953 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96pmk\" (UniqueName: \"kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.700998 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.701436 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.701491 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.701979 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702053 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702128 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702171 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702191 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702250 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702274 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702342 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702365 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data\") pod \"e59b525d-52f1-4d86-a5da-37c081313417\" (UID: \"e59b525d-52f1-4d86-a5da-37c081313417\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702396 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blf8s\" (UniqueName: \"kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702413 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle\") pod \"8b2afd63-9250-4f56-be2a-271b75704b95\" (UID: \"8b2afd63-9250-4f56-be2a-271b75704b95\") " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702846 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.702862 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs" (OuterVolumeSpecName: "logs") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.703132 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.704025 4826 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.704045 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b2afd63-9250-4f56-be2a-271b75704b95-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.704054 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.704063 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e59b525d-52f1-4d86-a5da-37c081313417-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.706684 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts" (OuterVolumeSpecName: "scripts") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.707981 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk" (OuterVolumeSpecName: "kube-api-access-96pmk") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "kube-api-access-96pmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.710094 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts" (OuterVolumeSpecName: "scripts") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.712742 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s" (OuterVolumeSpecName: "kube-api-access-blf8s") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "kube-api-access-blf8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.747532 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f" (OuterVolumeSpecName: "glance") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.753470 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.759270 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.785538 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.786900 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data" (OuterVolumeSpecName: "config-data") pod "8b2afd63-9250-4f56-be2a-271b75704b95" (UID: "8b2afd63-9250-4f56-be2a-271b75704b95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806231 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806260 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806271 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806280 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806288 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806298 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blf8s\" (UniqueName: \"kubernetes.io/projected/8b2afd63-9250-4f56-be2a-271b75704b95-kube-api-access-blf8s\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806311 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96pmk\" (UniqueName: \"kubernetes.io/projected/e59b525d-52f1-4d86-a5da-37c081313417-kube-api-access-96pmk\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806319 4826 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2afd63-9250-4f56-be2a-271b75704b95-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.806372 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") on node \"crc\" " Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.810580 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.842015 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data" (OuterVolumeSpecName: "config-data") pod "e59b525d-52f1-4d86-a5da-37c081313417" (UID: "e59b525d-52f1-4d86-a5da-37c081313417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.857783 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.857944 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f") on node "crc" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.908173 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.908210 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:45 crc kubenswrapper[4826]: I0319 19:19:45.908219 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b525d-52f1-4d86-a5da-37c081313417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.201339 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e59b525d-52f1-4d86-a5da-37c081313417","Type":"ContainerDied","Data":"250d1310bdb99ec08f400300424c500dd0c796dfb4f57da8efa297f9539dd7f6"} Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.201386 4826 scope.go:117] "RemoveContainer" containerID="f25fc23efa997d4174e70f1fe9be5552330d68c23639add1ce865beece04372f" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.201521 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.205522 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" event={"ID":"f69e3ed0-cf8e-438d-a6f0-dac56664901e","Type":"ContainerStarted","Data":"13db328b16b5b91f7dab9647f18f61f5279dc45c5f092cecb97afa038e776674"} Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.228229 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8b2afd63-9250-4f56-be2a-271b75704b95","Type":"ContainerDied","Data":"b453f513f1f8eb5c9b2f27cd22a881584f4abcac16a980e3c4eaf84ea7098067"} Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.228324 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.231562 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" podStartSLOduration=2.115705275 podStartE2EDuration="18.231547433s" podCreationTimestamp="2026-03-19 19:19:28 +0000 UTC" firstStartedPulling="2026-03-19 19:19:29.050038529 +0000 UTC m=+1393.804106842" lastFinishedPulling="2026-03-19 19:19:45.165880687 +0000 UTC m=+1409.919949000" observedRunningTime="2026-03-19 19:19:46.225349564 +0000 UTC m=+1410.979417877" watchObservedRunningTime="2026-03-19 19:19:46.231547433 +0000 UTC m=+1410.985615746" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.244781 4826 scope.go:117] "RemoveContainer" containerID="e4c9622a827f99ee1ce26e67a3f0278a1160184f8d6ad03c38537f6f5cff54c1" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.309068 4826 scope.go:117] "RemoveContainer" containerID="ea9f38be91b0a71b13926d7222d2bf66e5a8cd150a918024642d064f04a772e1" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.320457 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.373158 4826 scope.go:117] "RemoveContainer" containerID="550a8f683c972bd4e72cdadb62c99812697396be7235c67cb0bf0cd2f12f8d88" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.373952 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.374007 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.391835 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.428759 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.446186 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.446266 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.449190 4826 scope.go:117] "RemoveContainer" containerID="a0ea25b10134f259221bf3f106ba88d381ec21f9146a923476d1ce60455d1651" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.449395 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.456865 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457400 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-httpd" Mar 19 19:19:46 crc 
kubenswrapper[4826]: I0319 19:19:46.457418 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-httpd" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457437 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerName="heat-api" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457446 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerName="heat-api" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457459 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="init" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457465 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="init" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457482 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="sg-core" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457489 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="sg-core" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457503 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="proxy-httpd" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457517 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="proxy-httpd" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457535 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="dnsmasq-dns" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457541 4826 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="dnsmasq-dns" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457554 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-central-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457560 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-central-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457570 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-notification-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457576 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-notification-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457587 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-log" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457592 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-log" Mar 19 19:19:46 crc kubenswrapper[4826]: E0319 19:19:46.457598 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" containerName="heat-cfnapi" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457605 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" containerName="heat-cfnapi" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457823 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-notification-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457839 4826 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-log" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457849 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d5c8e9-6fae-4abc-996d-f1d3ce618ac8" containerName="heat-cfnapi" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457865 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="proxy-httpd" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457873 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="ceilometer-central-agent" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457884 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd164df-263a-4817-a6d0-bedc59421e75" containerName="dnsmasq-dns" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457893 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerName="heat-api" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457904 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" containerName="glance-httpd" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.457921 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59b525d-52f1-4d86-a5da-37c081313417" containerName="sg-core" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.460002 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.464016 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.464531 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.465503 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.485522 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.504450 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.506313 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.507054 4826 scope.go:117] "RemoveContainer" containerID="3b7f96ca48f5f84c67af82ba8d22e31bfc942a1b6150444356ba090ccd0a15de" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.509698 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.510959 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.517842 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7c88fd79fb-wz79s" podUID="6d932d9e-4ab6-4dcc-bff6-e20ebb18e3d3" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.229:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:19:46 crc 
kubenswrapper[4826]: I0319 19:19:46.526817 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528144 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528241 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528266 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsnp\" (UniqueName: \"kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528299 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " 
pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528329 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.528401 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.629927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630014 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630042 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjm5\" (UniqueName: \"kubernetes.io/projected/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-kube-api-access-7pjm5\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630067 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630094 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-logs\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630142 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630174 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630217 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630233 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630252 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630269 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsnp\" (UniqueName: \"kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630310 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630331 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.630355 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.636561 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.636870 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.639354 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.642060 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.643780 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.643986 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.660915 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsnp\" (UniqueName: \"kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp\") pod \"ceilometer-0\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732036 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732290 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjm5\" (UniqueName: \"kubernetes.io/projected/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-kube-api-access-7pjm5\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732396 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-logs\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732514 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732611 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732770 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732879 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732970 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.732796 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-logs\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.733647 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.737483 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.737690 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.738018 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.742197 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.742239 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5ddb69bbd0946a253964aa6f0f321c34484aa81c710d424a0f6be0ed74bf7c0/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.742337 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.750269 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjm5\" (UniqueName: \"kubernetes.io/projected/2bdeb398-28cb-4d25-89fd-81af1e9ad81e-kube-api-access-7pjm5\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.783599 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.790883 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7b89d7bb-5111-43ac-b0dd-7e23377ef32f\") pod \"glance-default-external-api-0\" (UID: \"2bdeb398-28cb-4d25-89fd-81af1e9ad81e\") " pod="openstack/glance-default-external-api-0" Mar 19 19:19:46 crc kubenswrapper[4826]: I0319 19:19:46.838718 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.294100 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.295145 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.391039 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:47 crc kubenswrapper[4826]: W0319 19:19:47.423536 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38208185_aceb_4b6b_8bfe_1316642c990a.slice/crio-fa37882a0eb970daa3f58a3fe0f03067bcacd925220b84d39abb279b86189efb WatchSource:0}: Error finding container fa37882a0eb970daa3f58a3fe0f03067bcacd925220b84d39abb279b86189efb: Status 404 returned error can't find the container with id fa37882a0eb970daa3f58a3fe0f03067bcacd925220b84d39abb279b86189efb Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.763294 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.994468 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8b2afd63-9250-4f56-be2a-271b75704b95" path="/var/lib/kubelet/pods/8b2afd63-9250-4f56-be2a-271b75704b95/volumes" Mar 19 19:19:47 crc kubenswrapper[4826]: I0319 19:19:47.995569 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59b525d-52f1-4d86-a5da-37c081313417" path="/var/lib/kubelet/pods/e59b525d-52f1-4d86-a5da-37c081313417/volumes" Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.311647 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.321337 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bdeb398-28cb-4d25-89fd-81af1e9ad81e","Type":"ContainerStarted","Data":"35d7f36e984a4b8b7d3635a8a1f43f292206ba221f4e871e415c53129fbb47a9"} Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.326998 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerStarted","Data":"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d"} Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.327088 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerStarted","Data":"fa37882a0eb970daa3f58a3fe0f03067bcacd925220b84d39abb279b86189efb"} Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.401177 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.548581 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:19:48 crc kubenswrapper[4826]: I0319 19:19:48.632548 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:49 crc 
kubenswrapper[4826]: I0319 19:19:49.217944 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.229988 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.239633 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data\") pod \"fba7304a-508e-40ce-899c-608fd790ee26\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.239700 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle\") pod \"fba7304a-508e-40ce-899c-608fd790ee26\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.239896 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom\") pod \"fba7304a-508e-40ce-899c-608fd790ee26\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.240030 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8pj5\" (UniqueName: \"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5\") pod \"fba7304a-508e-40ce-899c-608fd790ee26\" (UID: \"fba7304a-508e-40ce-899c-608fd790ee26\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.248891 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5" (OuterVolumeSpecName: "kube-api-access-z8pj5") pod "fba7304a-508e-40ce-899c-608fd790ee26" (UID: "fba7304a-508e-40ce-899c-608fd790ee26"). InnerVolumeSpecName "kube-api-access-z8pj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.255769 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fba7304a-508e-40ce-899c-608fd790ee26" (UID: "fba7304a-508e-40ce-899c-608fd790ee26"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.347592 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" event={"ID":"6f72991a-39f2-4e20-bbba-68e7761f0644","Type":"ContainerDied","Data":"71a288ef9674cb737997f93b27bdd7d3d476e228a4ba8a52ad37b0e60ca54555"} Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.347818 4826 scope.go:117] "RemoveContainer" containerID="18a42fbd22ce350dde3d9c34836042403dbbd26779f4f53aa1938a2fce6dfb67" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.347900 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6bc67f648c-7ktnt" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.350454 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data\") pod \"6f72991a-39f2-4e20-bbba-68e7761f0644\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.350616 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom\") pod \"6f72991a-39f2-4e20-bbba-68e7761f0644\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.350681 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9jpr\" (UniqueName: \"kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr\") pod \"6f72991a-39f2-4e20-bbba-68e7761f0644\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.350727 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle\") pod \"6f72991a-39f2-4e20-bbba-68e7761f0644\" (UID: \"6f72991a-39f2-4e20-bbba-68e7761f0644\") " Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.351609 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.351620 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8pj5\" (UniqueName: 
\"kubernetes.io/projected/fba7304a-508e-40ce-899c-608fd790ee26-kube-api-access-z8pj5\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.375882 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f72991a-39f2-4e20-bbba-68e7761f0644" (UID: "6f72991a-39f2-4e20-bbba-68e7761f0644"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.384444 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr" (OuterVolumeSpecName: "kube-api-access-j9jpr") pod "6f72991a-39f2-4e20-bbba-68e7761f0644" (UID: "6f72991a-39f2-4e20-bbba-68e7761f0644"). InnerVolumeSpecName "kube-api-access-j9jpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.390137 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d8697dd89-f2jgx" event={"ID":"fba7304a-508e-40ce-899c-608fd790ee26","Type":"ContainerDied","Data":"cbcbf27e3ef482164887bde4ba3f7c94f36488fac325c72a365d0e2a6952bd26"} Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.390140 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-d8697dd89-f2jgx" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.397118 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.397140 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.398457 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bdeb398-28cb-4d25-89fd-81af1e9ad81e","Type":"ContainerStarted","Data":"50497dbd8e6a5b5c367e7f324f243d890c97dee21b72ab0e2060c83e4711949e"} Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.454115 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.454145 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9jpr\" (UniqueName: \"kubernetes.io/projected/6f72991a-39f2-4e20-bbba-68e7761f0644-kube-api-access-j9jpr\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.471061 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fba7304a-508e-40ce-899c-608fd790ee26" (UID: "fba7304a-508e-40ce-899c-608fd790ee26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.475821 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data" (OuterVolumeSpecName: "config-data") pod "fba7304a-508e-40ce-899c-608fd790ee26" (UID: "fba7304a-508e-40ce-899c-608fd790ee26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.486045 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f72991a-39f2-4e20-bbba-68e7761f0644" (UID: "6f72991a-39f2-4e20-bbba-68e7761f0644"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.516168 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data" (OuterVolumeSpecName: "config-data") pod "6f72991a-39f2-4e20-bbba-68e7761f0644" (UID: "6f72991a-39f2-4e20-bbba-68e7761f0644"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.555869 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.556143 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.556157 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fba7304a-508e-40ce-899c-608fd790ee26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.556166 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f72991a-39f2-4e20-bbba-68e7761f0644-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.569729 4826 scope.go:117] "RemoveContainer" containerID="7aec749a0c7586daee3e38e3fcffd060f07c049dea607343a073230cf02a1985" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.682562 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.692922 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6bc67f648c-7ktnt"] Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.724702 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.735065 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-d8697dd89-f2jgx"] Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 
19:19:49.990628 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" path="/var/lib/kubelet/pods/6f72991a-39f2-4e20-bbba-68e7761f0644/volumes" Mar 19 19:19:49 crc kubenswrapper[4826]: I0319 19:19:49.991463 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba7304a-508e-40ce-899c-608fd790ee26" path="/var/lib/kubelet/pods/fba7304a-508e-40ce-899c-608fd790ee26/volumes" Mar 19 19:19:50 crc kubenswrapper[4826]: I0319 19:19:50.410284 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerStarted","Data":"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852"} Mar 19 19:19:50 crc kubenswrapper[4826]: I0319 19:19:50.410585 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerStarted","Data":"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf"} Mar 19 19:19:50 crc kubenswrapper[4826]: I0319 19:19:50.413231 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bdeb398-28cb-4d25-89fd-81af1e9ad81e","Type":"ContainerStarted","Data":"497a884daa4817cc1559b13ba9cab9670627b34726c78f340d287b441dc9b6c8"} Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.437325 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerStarted","Data":"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29"} Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.437939 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.458037 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=6.458019693 podStartE2EDuration="6.458019693s" podCreationTimestamp="2026-03-19 19:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:19:50.441254899 +0000 UTC m=+1415.195323212" watchObservedRunningTime="2026-03-19 19:19:52.458019693 +0000 UTC m=+1417.212088006" Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.548738 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.548862 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.552901 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 19:19:52 crc kubenswrapper[4826]: I0319 19:19:52.580136 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.013910551 podStartE2EDuration="6.580114684s" podCreationTimestamp="2026-03-19 19:19:46 +0000 UTC" firstStartedPulling="2026-03-19 19:19:47.428633706 +0000 UTC m=+1412.182702019" lastFinishedPulling="2026-03-19 19:19:51.994837839 +0000 UTC m=+1416.748906152" observedRunningTime="2026-03-19 19:19:52.458317561 +0000 UTC m=+1417.212385904" watchObservedRunningTime="2026-03-19 19:19:52.580114684 +0000 UTC m=+1417.334183007" Mar 19 19:19:53 crc kubenswrapper[4826]: I0319 19:19:53.131380 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:19:53 crc kubenswrapper[4826]: I0319 19:19:53.201919 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:19:53 crc kubenswrapper[4826]: I0319 19:19:53.202147 4826 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/heat-engine-79595bb545-xcckb" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" containerID="cri-o://230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" gracePeriod=60 Mar 19 19:19:55 crc kubenswrapper[4826]: I0319 19:19:55.185828 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:19:55 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:19:55 crc kubenswrapper[4826]: > Mar 19 19:19:56 crc kubenswrapper[4826]: E0319 19:19:56.209191 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:19:56 crc kubenswrapper[4826]: E0319 19:19:56.210564 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:19:56 crc kubenswrapper[4826]: E0319 19:19:56.212013 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:19:56 crc kubenswrapper[4826]: E0319 19:19:56.212081 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-79595bb545-xcckb" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" Mar 19 19:19:56 crc kubenswrapper[4826]: I0319 19:19:56.839840 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:19:56 crc kubenswrapper[4826]: I0319 19:19:56.840195 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 19:19:56 crc kubenswrapper[4826]: I0319 19:19:56.895208 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:19:56 crc kubenswrapper[4826]: I0319 19:19:56.911216 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 19:19:57 crc kubenswrapper[4826]: I0319 19:19:57.484803 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:19:57 crc kubenswrapper[4826]: I0319 19:19:57.484855 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.034227 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.034786 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-central-agent" containerID="cri-o://1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d" gracePeriod=30 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.034860 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-notification-agent" containerID="cri-o://d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf" gracePeriod=30 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.034880 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="proxy-httpd" containerID="cri-o://b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29" gracePeriod=30 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.034873 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="sg-core" containerID="cri-o://7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852" gracePeriod=30 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.498356 4826 generic.go:334] "Generic (PLEG): container finished" podID="38208185-aceb-4b6b-8bfe-1316642c990a" containerID="b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29" exitCode=0 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.498626 4826 generic.go:334] "Generic (PLEG): container finished" podID="38208185-aceb-4b6b-8bfe-1316642c990a" containerID="7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852" exitCode=2 Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.498454 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerDied","Data":"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29"} Mar 19 19:19:58 crc kubenswrapper[4826]: I0319 19:19:58.498720 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerDied","Data":"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852"} Mar 19 19:19:59 crc 
kubenswrapper[4826]: I0319 19:19:59.511267 4826 generic.go:334] "Generic (PLEG): container finished" podID="38208185-aceb-4b6b-8bfe-1316642c990a" containerID="d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf" exitCode=0 Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.511309 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerDied","Data":"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf"} Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.705677 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.706051 4826 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.718289 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:19:59 crc kubenswrapper[4826]: E0319 19:19:59.718920 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.718944 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: E0319 19:19:59.718966 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.718975 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" Mar 19 19:19:59 crc kubenswrapper[4826]: E0319 19:19:59.718986 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" 
Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.718995 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" Mar 19 19:19:59 crc kubenswrapper[4826]: E0319 19:19:59.719027 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.719034 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.719230 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.719243 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.719263 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f72991a-39f2-4e20-bbba-68e7761f0644" containerName="heat-cfnapi" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.719274 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba7304a-508e-40ce-899c-608fd790ee26" containerName="heat-api" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.720974 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.741338 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.749469 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.814000 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.814302 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.814469 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nqv\" (UniqueName: \"kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.921981 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nqv\" (UniqueName: \"kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv\") pod \"community-operators-wrcwr\" (UID: 
\"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.922073 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.922272 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.922893 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.926973 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:19:59 crc kubenswrapper[4826]: I0319 19:19:59.955387 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nqv\" (UniqueName: \"kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv\") pod \"community-operators-wrcwr\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " 
pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.053086 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.225714 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565800-l7j4f"] Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.227435 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.235770 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.235949 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.236026 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.251754 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-l7j4f"] Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.362223 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxjt\" (UniqueName: \"kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt\") pod \"auto-csr-approver-29565800-l7j4f\" (UID: \"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc\") " pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.464177 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxjt\" (UniqueName: 
\"kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt\") pod \"auto-csr-approver-29565800-l7j4f\" (UID: \"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc\") " pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.489914 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxjt\" (UniqueName: \"kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt\") pod \"auto-csr-approver-29565800-l7j4f\" (UID: \"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc\") " pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.563251 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:00 crc kubenswrapper[4826]: I0319 19:20:00.682693 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:20:01 crc kubenswrapper[4826]: I0319 19:20:01.129683 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-l7j4f"] Mar 19 19:20:01 crc kubenswrapper[4826]: I0319 19:20:01.549816 4826 generic.go:334] "Generic (PLEG): container finished" podID="451fe5a3-8a05-42fd-8059-85084283a59f" containerID="5527f42efe8f2e3c3378d7c9cd685ea7f203c135be6bef00b3f34316bb528565" exitCode=0 Mar 19 19:20:01 crc kubenswrapper[4826]: I0319 19:20:01.550246 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerDied","Data":"5527f42efe8f2e3c3378d7c9cd685ea7f203c135be6bef00b3f34316bb528565"} Mar 19 19:20:01 crc kubenswrapper[4826]: I0319 19:20:01.550312 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" 
event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerStarted","Data":"e54c2aefa21c8964848c23fe1a7d6640300f53c2c7fe8fa41c7e97a22f2b8448"} Mar 19 19:20:01 crc kubenswrapper[4826]: I0319 19:20:01.556513 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" event={"ID":"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc","Type":"ContainerStarted","Data":"29bbe1e8e287ea8a02aa80222936ad5d96082328cbb2a7323672ce9b7ada515a"} Mar 19 19:20:03 crc kubenswrapper[4826]: I0319 19:20:03.596867 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerStarted","Data":"bb275d63b1df93aa76077b15ed436d436cf33fa341f768bdf2e8096cfcc50b93"} Mar 19 19:20:03 crc kubenswrapper[4826]: I0319 19:20:03.606851 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" event={"ID":"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc","Type":"ContainerStarted","Data":"dad70c4189d94827d3fb69e4bb747fcd71cb6a4ae9acbcdbf1334d3d619c5756"} Mar 19 19:20:03 crc kubenswrapper[4826]: I0319 19:20:03.611834 4826 generic.go:334] "Generic (PLEG): container finished" podID="f69e3ed0-cf8e-438d-a6f0-dac56664901e" containerID="13db328b16b5b91f7dab9647f18f61f5279dc45c5f092cecb97afa038e776674" exitCode=0 Mar 19 19:20:03 crc kubenswrapper[4826]: I0319 19:20:03.611881 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" event={"ID":"f69e3ed0-cf8e-438d-a6f0-dac56664901e","Type":"ContainerDied","Data":"13db328b16b5b91f7dab9647f18f61f5279dc45c5f092cecb97afa038e776674"} Mar 19 19:20:03 crc kubenswrapper[4826]: I0319 19:20:03.640965 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" podStartSLOduration=2.245652052 podStartE2EDuration="3.640946596s" podCreationTimestamp="2026-03-19 19:20:00 
+0000 UTC" firstStartedPulling="2026-03-19 19:20:01.126220937 +0000 UTC m=+1425.880289250" lastFinishedPulling="2026-03-19 19:20:02.521515481 +0000 UTC m=+1427.275583794" observedRunningTime="2026-03-19 19:20:03.635380941 +0000 UTC m=+1428.389449254" watchObservedRunningTime="2026-03-19 19:20:03.640946596 +0000 UTC m=+1428.395014909" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.347215 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480426 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480505 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480531 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480571 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480601 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480681 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcsnp\" (UniqueName: \"kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.480738 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml\") pod \"38208185-aceb-4b6b-8bfe-1316642c990a\" (UID: \"38208185-aceb-4b6b-8bfe-1316642c990a\") " Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.482510 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.482754 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.489544 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp" (OuterVolumeSpecName: "kube-api-access-lcsnp") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "kube-api-access-lcsnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.491006 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts" (OuterVolumeSpecName: "scripts") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.534802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.583388 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.583415 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38208185-aceb-4b6b-8bfe-1316642c990a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.583423 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.583431 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcsnp\" (UniqueName: \"kubernetes.io/projected/38208185-aceb-4b6b-8bfe-1316642c990a-kube-api-access-lcsnp\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.583440 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.609277 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.623341 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data" (OuterVolumeSpecName: "config-data") pod "38208185-aceb-4b6b-8bfe-1316642c990a" (UID: "38208185-aceb-4b6b-8bfe-1316642c990a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.625190 4826 generic.go:334] "Generic (PLEG): container finished" podID="38208185-aceb-4b6b-8bfe-1316642c990a" containerID="1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d" exitCode=0 Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.625263 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerDied","Data":"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d"} Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.625315 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38208185-aceb-4b6b-8bfe-1316642c990a","Type":"ContainerDied","Data":"fa37882a0eb970daa3f58a3fe0f03067bcacd925220b84d39abb279b86189efb"} Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.625334 4826 scope.go:117] "RemoveContainer" containerID="b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.625386 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.665551 4826 scope.go:117] "RemoveContainer" containerID="7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.666259 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.699275 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.720702 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.720731 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38208185-aceb-4b6b-8bfe-1316642c990a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.724772 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.725566 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="sg-core" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.725608 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="sg-core" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.725633 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-central-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.725642 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" 
containerName="ceilometer-central-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.725693 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="proxy-httpd" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.725705 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="proxy-httpd" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.725730 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-notification-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.725758 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-notification-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.726171 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="proxy-httpd" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.726193 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-central-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.726205 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="ceilometer-notification-agent" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.726225 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" containerName="sg-core" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.739355 4826 scope.go:117] "RemoveContainer" containerID="d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.741945 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.755842 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.762876 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.763236 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.795248 4826 scope.go:117] "RemoveContainer" containerID="1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.821434 4826 scope.go:117] "RemoveContainer" containerID="b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.825258 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29\": container with ID starting with b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29 not found: ID does not exist" containerID="b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.825309 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29"} err="failed to get container status \"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29\": rpc error: code = NotFound desc = could not find container \"b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29\": container with ID starting with b3d3ffc245d5ebf9fea0afff853a781d2d4210525bd074778db36ce2c8efea29 not found: ID does not exist" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 
19:20:04.825331 4826 scope.go:117] "RemoveContainer" containerID="7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.827549 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852\": container with ID starting with 7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852 not found: ID does not exist" containerID="7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.827572 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852"} err="failed to get container status \"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852\": rpc error: code = NotFound desc = could not find container \"7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852\": container with ID starting with 7467a99cf47615c389cc307f805b8cc5d39ce54b49f3652c74297080d2ff8852 not found: ID does not exist" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.827588 4826 scope.go:117] "RemoveContainer" containerID="d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.828713 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf\": container with ID starting with d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf not found: ID does not exist" containerID="d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.828733 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf"} err="failed to get container status \"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf\": rpc error: code = NotFound desc = could not find container \"d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf\": container with ID starting with d9f8c0551c821d12250f72d1fc2827c94b3a6351414f2c6bbe07beef2177afcf not found: ID does not exist" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.828746 4826 scope.go:117] "RemoveContainer" containerID="1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d" Mar 19 19:20:04 crc kubenswrapper[4826]: E0319 19:20:04.829728 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d\": container with ID starting with 1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d not found: ID does not exist" containerID="1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.829750 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d"} err="failed to get container status \"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d\": rpc error: code = NotFound desc = could not find container \"1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d\": container with ID starting with 1866ac882b06b33c951bae16b5de3447ff7d202831777eef2c94947e017a091d not found: ID does not exist" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925502 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksng\" (UniqueName: \"kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng\") pod 
\"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925551 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925577 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925613 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925639 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 19:20:04.925674 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:04 crc kubenswrapper[4826]: I0319 
19:20:04.925723 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027249 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027396 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksng\" (UniqueName: \"kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027419 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027443 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027476 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027508 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.027528 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.028028 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.031639 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.032835 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.037504 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.040220 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.050532 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:05 crc kubenswrapper[4826]: E0319 19:20:05.051428 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-cksng], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="3a17ed35-ca29-4371-b7a1-ea55463f4033" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.051540 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksng\" (UniqueName: \"kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.069875 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.172076 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.195757 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:20:05 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:20:05 crc kubenswrapper[4826]: > Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.332930 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2mww\" (UniqueName: \"kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww\") pod \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.333284 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts\") pod \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.333372 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle\") pod \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.333510 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data\") pod \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\" (UID: \"f69e3ed0-cf8e-438d-a6f0-dac56664901e\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.338172 4826 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww" (OuterVolumeSpecName: "kube-api-access-q2mww") pod "f69e3ed0-cf8e-438d-a6f0-dac56664901e" (UID: "f69e3ed0-cf8e-438d-a6f0-dac56664901e"). InnerVolumeSpecName "kube-api-access-q2mww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.343383 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts" (OuterVolumeSpecName: "scripts") pod "f69e3ed0-cf8e-438d-a6f0-dac56664901e" (UID: "f69e3ed0-cf8e-438d-a6f0-dac56664901e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.367245 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f69e3ed0-cf8e-438d-a6f0-dac56664901e" (UID: "f69e3ed0-cf8e-438d-a6f0-dac56664901e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.387541 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data" (OuterVolumeSpecName: "config-data") pod "f69e3ed0-cf8e-438d-a6f0-dac56664901e" (UID: "f69e3ed0-cf8e-438d-a6f0-dac56664901e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.436775 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.436809 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2mww\" (UniqueName: \"kubernetes.io/projected/f69e3ed0-cf8e-438d-a6f0-dac56664901e-kube-api-access-q2mww\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.436819 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.436828 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69e3ed0-cf8e-438d-a6f0-dac56664901e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.638161 4826 generic.go:334] "Generic (PLEG): container finished" podID="eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" containerID="dad70c4189d94827d3fb69e4bb747fcd71cb6a4ae9acbcdbf1334d3d619c5756" exitCode=0 Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.638254 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" event={"ID":"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc","Type":"ContainerDied","Data":"dad70c4189d94827d3fb69e4bb747fcd71cb6a4ae9acbcdbf1334d3d619c5756"} Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.640905 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" 
event={"ID":"f69e3ed0-cf8e-438d-a6f0-dac56664901e","Type":"ContainerDied","Data":"c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca"} Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.640937 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c39672eee0893827c86076897c9a17dc4b22dd5e239b0314ce758f37306318ca" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.640950 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-nn4pv" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.642579 4826 generic.go:334] "Generic (PLEG): container finished" podID="451fe5a3-8a05-42fd-8059-85084283a59f" containerID="bb275d63b1df93aa76077b15ed436d436cf33fa341f768bdf2e8096cfcc50b93" exitCode=0 Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.642673 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerDied","Data":"bb275d63b1df93aa76077b15ed436d436cf33fa341f768bdf2e8096cfcc50b93"} Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.642720 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.654132 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742443 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742546 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742784 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742865 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742939 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.742983 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.743047 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksng\" (UniqueName: \"kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng\") pod \"3a17ed35-ca29-4371-b7a1-ea55463f4033\" (UID: \"3a17ed35-ca29-4371-b7a1-ea55463f4033\") " Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.743283 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.743408 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.743899 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.743997 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a17ed35-ca29-4371-b7a1-ea55463f4033-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.750100 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.750236 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data" (OuterVolumeSpecName: "config-data") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.751779 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts" (OuterVolumeSpecName: "scripts") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.751903 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.754871 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng" (OuterVolumeSpecName: "kube-api-access-cksng") pod "3a17ed35-ca29-4371-b7a1-ea55463f4033" (UID: "3a17ed35-ca29-4371-b7a1-ea55463f4033"). InnerVolumeSpecName "kube-api-access-cksng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.766185 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 19:20:05 crc kubenswrapper[4826]: E0319 19:20:05.766724 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69e3ed0-cf8e-438d-a6f0-dac56664901e" containerName="nova-cell0-conductor-db-sync" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.766746 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69e3ed0-cf8e-438d-a6f0-dac56664901e" containerName="nova-cell0-conductor-db-sync" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.766971 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69e3ed0-cf8e-438d-a6f0-dac56664901e" containerName="nova-cell0-conductor-db-sync" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.767770 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.771245 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.771378 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8p84g" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.784700 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846033 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846074 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv7k\" (UniqueName: \"kubernetes.io/projected/d98a927a-17ed-4120-b2e9-0480ce473022-kube-api-access-srv7k\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846170 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846567 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846598 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846610 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846618 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a17ed35-ca29-4371-b7a1-ea55463f4033-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.846629 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksng\" (UniqueName: \"kubernetes.io/projected/3a17ed35-ca29-4371-b7a1-ea55463f4033-kube-api-access-cksng\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.948978 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.949248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.949281 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srv7k\" (UniqueName: \"kubernetes.io/projected/d98a927a-17ed-4120-b2e9-0480ce473022-kube-api-access-srv7k\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.953175 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.954125 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98a927a-17ed-4120-b2e9-0480ce473022-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.972774 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv7k\" (UniqueName: \"kubernetes.io/projected/d98a927a-17ed-4120-b2e9-0480ce473022-kube-api-access-srv7k\") pod \"nova-cell0-conductor-0\" (UID: \"d98a927a-17ed-4120-b2e9-0480ce473022\") " pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:05 crc kubenswrapper[4826]: I0319 19:20:05.993846 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38208185-aceb-4b6b-8bfe-1316642c990a" path="/var/lib/kubelet/pods/38208185-aceb-4b6b-8bfe-1316642c990a/volumes" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.095791 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.218780 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b is running failed: container process not found" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.229785 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b is running failed: container process not found" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.238010 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b is running failed: container process not found" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.238070 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-79595bb545-xcckb" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.383263 4826 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.472477 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data\") pod \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.472561 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmjqb\" (UniqueName: \"kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb\") pod \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.472681 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle\") pod \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.472784 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom\") pod \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\" (UID: \"6a3611e1-4719-476c-8d8b-ceccedcb14bc\") " Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.478390 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a3611e1-4719-476c-8d8b-ceccedcb14bc" (UID: "6a3611e1-4719-476c-8d8b-ceccedcb14bc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.489438 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb" (OuterVolumeSpecName: "kube-api-access-mmjqb") pod "6a3611e1-4719-476c-8d8b-ceccedcb14bc" (UID: "6a3611e1-4719-476c-8d8b-ceccedcb14bc"). InnerVolumeSpecName "kube-api-access-mmjqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.538798 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a3611e1-4719-476c-8d8b-ceccedcb14bc" (UID: "6a3611e1-4719-476c-8d8b-ceccedcb14bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.571340 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data" (OuterVolumeSpecName: "config-data") pod "6a3611e1-4719-476c-8d8b-ceccedcb14bc" (UID: "6a3611e1-4719-476c-8d8b-ceccedcb14bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.578513 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmjqb\" (UniqueName: \"kubernetes.io/projected/6a3611e1-4719-476c-8d8b-ceccedcb14bc-kube-api-access-mmjqb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.578582 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.578597 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.578608 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3611e1-4719-476c-8d8b-ceccedcb14bc-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.660082 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerStarted","Data":"4180f287567b7892702a7e7731d229aa070a0ae74ad3d5ea4bbc1997f80f30ac"} Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.672150 4826 generic.go:334] "Generic (PLEG): container finished" podID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" exitCode=0 Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.672268 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.672934 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79595bb545-xcckb" event={"ID":"6a3611e1-4719-476c-8d8b-ceccedcb14bc","Type":"ContainerDied","Data":"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b"} Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.672992 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79595bb545-xcckb" event={"ID":"6a3611e1-4719-476c-8d8b-ceccedcb14bc","Type":"ContainerDied","Data":"1ce291fa5ab6a3a2bd586ef9a9f2d801ada93141230d7c7668c16e9eaa84c838"} Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.673026 4826 scope.go:117] "RemoveContainer" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.673210 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79595bb545-xcckb" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.696706 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrcwr" podStartSLOduration=2.90711618 podStartE2EDuration="7.696688281s" podCreationTimestamp="2026-03-19 19:19:59 +0000 UTC" firstStartedPulling="2026-03-19 19:20:01.554712983 +0000 UTC m=+1426.308781286" lastFinishedPulling="2026-03-19 19:20:06.344285074 +0000 UTC m=+1431.098353387" observedRunningTime="2026-03-19 19:20:06.679698911 +0000 UTC m=+1431.433767234" watchObservedRunningTime="2026-03-19 19:20:06.696688281 +0000 UTC m=+1431.450756594" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.729630 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.731553 4826 scope.go:117] "RemoveContainer" 
containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.732098 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b\": container with ID starting with 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b not found: ID does not exist" containerID="230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.732161 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b"} err="failed to get container status \"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b\": rpc error: code = NotFound desc = could not find container \"230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b\": container with ID starting with 230b6d2884e1b6005c051b0b95d6403e68a27adb8c35eeee9e84242ad167358b not found: ID does not exist" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.753984 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-79595bb545-xcckb"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.791918 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.822042 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.826053 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:06 crc kubenswrapper[4826]: E0319 19:20:06.826595 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 
19:20:06.826620 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.826897 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" containerName="heat-engine" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.829419 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.831737 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.832050 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.848821 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.860343 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.986972 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987019 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987108 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79ptj\" (UniqueName: \"kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987126 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987183 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987236 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:06 crc kubenswrapper[4826]: I0319 19:20:06.987257 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090070 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090119 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090191 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090210 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090300 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79ptj\" (UniqueName: \"kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.090319 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc 
kubenswrapper[4826]: I0319 19:20:07.090380 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.091380 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.093704 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.098080 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.099505 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.112739 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79ptj\" (UniqueName: \"kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj\") pod \"ceilometer-0\" (UID: 
\"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.113227 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.120904 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.226842 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.336889 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.398884 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxjt\" (UniqueName: \"kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt\") pod \"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc\" (UID: \"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc\") " Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.403719 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt" (OuterVolumeSpecName: "kube-api-access-rsxjt") pod "eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" (UID: "eb05aceb-7f5c-42c3-a9a6-1242def4b9cc"). InnerVolumeSpecName "kube-api-access-rsxjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.501696 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsxjt\" (UniqueName: \"kubernetes.io/projected/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc-kube-api-access-rsxjt\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.721139 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" event={"ID":"eb05aceb-7f5c-42c3-a9a6-1242def4b9cc","Type":"ContainerDied","Data":"29bbe1e8e287ea8a02aa80222936ad5d96082328cbb2a7323672ce9b7ada515a"} Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.721537 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29bbe1e8e287ea8a02aa80222936ad5d96082328cbb2a7323672ce9b7ada515a" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.721593 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565800-l7j4f" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.725575 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d98a927a-17ed-4120-b2e9-0480ce473022","Type":"ContainerStarted","Data":"d3b81b2daa5b28225e41de8684d052906afadd2ad0cee7cbbdb63ab147f49447"} Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.725617 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d98a927a-17ed-4120-b2e9-0480ce473022","Type":"ContainerStarted","Data":"28911f787401cd0387e909fd81bbd849085ae316329640afa775b3b8e8bc163f"} Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.726958 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.727550 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29565794-6r8f4"] Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.738442 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565794-6r8f4"] Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.746989 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.746972655 podStartE2EDuration="2.746972655s" podCreationTimestamp="2026-03-19 19:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:07.743212814 +0000 UTC m=+1432.497281127" watchObservedRunningTime="2026-03-19 19:20:07.746972655 +0000 UTC m=+1432.501040968" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.874695 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:07 crc kubenswrapper[4826]: W0319 19:20:07.879593 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8d8fc1_0ebe_47a6_87ff_f93d31c7490e.slice/crio-d5afd36b70a251ab89cf89cc9c73d9cb885a8d3deb8b28743481910151c6633d WatchSource:0}: Error finding container d5afd36b70a251ab89cf89cc9c73d9cb885a8d3deb8b28743481910151c6633d: Status 404 returned error can't find the container with id d5afd36b70a251ab89cf89cc9c73d9cb885a8d3deb8b28743481910151c6633d Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.988313 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a17ed35-ca29-4371-b7a1-ea55463f4033" path="/var/lib/kubelet/pods/3a17ed35-ca29-4371-b7a1-ea55463f4033/volumes" Mar 19 19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.988764 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3611e1-4719-476c-8d8b-ceccedcb14bc" path="/var/lib/kubelet/pods/6a3611e1-4719-476c-8d8b-ceccedcb14bc/volumes" Mar 19 
19:20:07 crc kubenswrapper[4826]: I0319 19:20:07.990401 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c439c10-be6f-469b-a52e-1a77f9afc7be" path="/var/lib/kubelet/pods/6c439c10-be6f-469b-a52e-1a77f9afc7be/volumes" Mar 19 19:20:08 crc kubenswrapper[4826]: I0319 19:20:08.748833 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerStarted","Data":"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb"} Mar 19 19:20:08 crc kubenswrapper[4826]: I0319 19:20:08.749416 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerStarted","Data":"d5afd36b70a251ab89cf89cc9c73d9cb885a8d3deb8b28743481910151c6633d"} Mar 19 19:20:09 crc kubenswrapper[4826]: I0319 19:20:09.762119 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerStarted","Data":"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a"} Mar 19 19:20:10 crc kubenswrapper[4826]: I0319 19:20:10.054080 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:10 crc kubenswrapper[4826]: I0319 19:20:10.054354 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:10 crc kubenswrapper[4826]: I0319 19:20:10.126051 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:10 crc kubenswrapper[4826]: I0319 19:20:10.776079 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerStarted","Data":"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef"} Mar 19 19:20:14 crc kubenswrapper[4826]: I0319 19:20:14.825138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerStarted","Data":"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a"} Mar 19 19:20:14 crc kubenswrapper[4826]: I0319 19:20:14.825696 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:20:14 crc kubenswrapper[4826]: I0319 19:20:14.876074 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.222236962 podStartE2EDuration="8.876048371s" podCreationTimestamp="2026-03-19 19:20:06 +0000 UTC" firstStartedPulling="2026-03-19 19:20:07.881999488 +0000 UTC m=+1432.636067801" lastFinishedPulling="2026-03-19 19:20:13.535810897 +0000 UTC m=+1438.289879210" observedRunningTime="2026-03-19 19:20:14.855255288 +0000 UTC m=+1439.609323601" watchObservedRunningTime="2026-03-19 19:20:14.876048371 +0000 UTC m=+1439.630116694" Mar 19 19:20:15 crc kubenswrapper[4826]: I0319 19:20:15.193998 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:20:15 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:20:15 crc kubenswrapper[4826]: > Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.152432 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.731546 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jsjkk"] Mar 19 19:20:16 crc 
kubenswrapper[4826]: E0319 19:20:16.732147 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" containerName="oc" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.732171 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" containerName="oc" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.732487 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" containerName="oc" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.733475 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.736038 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.736264 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.754701 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jsjkk"] Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.845793 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.845947 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5kqk\" (UniqueName: \"kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: 
\"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.846025 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.846297 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.948161 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.948424 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.948492 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5kqk\" (UniqueName: \"kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: 
\"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.948544 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.962811 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.964504 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.976168 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.976744 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.977594 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5kqk\" (UniqueName: \"kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.979160 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle\") pod 
\"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.980143 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:16 crc kubenswrapper[4826]: I0319 19:20:16.983241 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data\") pod \"nova-cell0-cell-mapping-jsjkk\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.010843 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.012761 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.022059 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.041538 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.043387 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.051258 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lrn\" (UniqueName: \"kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.051611 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.051710 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.052380 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.061104 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.081517 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.149627 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.154947 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155004 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfxp8\" (UniqueName: \"kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155037 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155072 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155135 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7kj\" (UniqueName: \"kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155159 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155181 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155215 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lrn\" (UniqueName: \"kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155253 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155286 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.155332 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.164470 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.185277 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.200202 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lrn\" (UniqueName: \"kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn\") pod \"nova-scheduler-0\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.241223 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.245049 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.256876 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.258393 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.259944 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.259958 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260261 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7kj\" (UniqueName: \"kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260317 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 
crc kubenswrapper[4826]: I0319 19:20:17.260527 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260692 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.260923 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfxp8\" (UniqueName: \"kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.262071 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.262469 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs\") pod 
\"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.271765 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.276208 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.276304 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.286544 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.292631 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7kj\" (UniqueName: \"kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj\") pod \"nova-api-0\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.293566 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tfxp8\" (UniqueName: \"kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8\") pod \"nova-metadata-0\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.298361 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.327280 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.347592 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.364992 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365145 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365239 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365385 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365475 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365539 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365647 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.365816 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnf6\" (UniqueName: \"kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc 
kubenswrapper[4826]: I0319 19:20:17.365957 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5spl\" (UniqueName: \"kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.417691 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471022 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5spl\" (UniqueName: \"kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471392 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471434 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471523 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" 
(UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471758 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471853 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.471884 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.472003 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.472179 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnf6\" (UniqueName: \"kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.479892 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.480115 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.485592 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.487722 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.488837 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.495698 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.507538 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnf6\" (UniqueName: \"kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.516185 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5spl\" (UniqueName: \"kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl\") pod \"nova-cell1-novncproxy-0\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.517460 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.517467 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-2l77b\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.774285 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.789089 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jsjkk"] Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.796691 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:17 crc kubenswrapper[4826]: I0319 19:20:17.893760 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jsjkk" event={"ID":"05dd165a-9504-4521-9814-4e252234fd9b","Type":"ContainerStarted","Data":"621a88c21f917f8ce1803918371a7b1350e5fae6d77e5417f442bfa39c4b81f6"} Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.167746 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.665640 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.690374 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.856214 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.910623 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.924758 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mdwdm"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.926716 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.936836 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.937043 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.945360 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jsjkk" event={"ID":"05dd165a-9504-4521-9814-4e252234fd9b","Type":"ContainerStarted","Data":"95feeb7d4dd3a671e8330b63a53f2465bf33a41ef703a61445cf7d0fb1715f3d"} Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.945796 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mdwdm"] Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.955990 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.956315 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.956478 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5fc\" (UniqueName: 
\"kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.956557 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.957268 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" event={"ID":"9d685221-0f29-401d-af30-cbf46d71761f","Type":"ContainerStarted","Data":"b6aab2661df461134f6042d890f6e93d6ef0741cdf9dbb0248c37bf9029966fc"} Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.980351 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a2c4496-cd50-45d1-a17b-23c600e9ea92","Type":"ContainerStarted","Data":"1c502d9c4c609c0365f53f2d4482ca38dcb3ae0157f3ad190663c9766218349c"} Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.989724 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerStarted","Data":"203de1919c5b0f25c59f540cfddbdc98ef2a2cff5a27ff43afd1ac66bf65d1a7"} Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.991360 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jsjkk" podStartSLOduration=2.991343234 podStartE2EDuration="2.991343234s" podCreationTimestamp="2026-03-19 19:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:20:18.98497649 +0000 UTC m=+1443.739044803" watchObservedRunningTime="2026-03-19 19:20:18.991343234 +0000 UTC m=+1443.745411547" Mar 19 19:20:18 crc kubenswrapper[4826]: I0319 19:20:18.994041 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38a2c36f-5d7d-4666-b6ed-833eb5e4df87","Type":"ContainerStarted","Data":"c2536f3d9c9e81c3508e4f4dfe2d26929cacba95a23b8703a715e65389f7b2e4"} Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.006054 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerStarted","Data":"cd16d1579f38d37e6a172017d701d37e70973847e05a6d68cb826a44bd7a78fc"} Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.081920 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.082197 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.082465 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5fc\" (UniqueName: \"kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc 
kubenswrapper[4826]: I0319 19:20:19.083952 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.090509 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.104442 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.105082 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.114231 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5fc\" (UniqueName: \"kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc\") pod \"nova-cell1-conductor-db-sync-mdwdm\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:19 crc kubenswrapper[4826]: I0319 19:20:19.405431 
4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.009607 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mdwdm"] Mar 19 19:20:20 crc kubenswrapper[4826]: W0319 19:20:20.040057 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15200e4f_15d2_450d_93ff_3b26e3df0b48.slice/crio-bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e WatchSource:0}: Error finding container bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e: Status 404 returned error can't find the container with id bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.066746 4826 generic.go:334] "Generic (PLEG): container finished" podID="9d685221-0f29-401d-af30-cbf46d71761f" containerID="ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad" exitCode=0 Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.089527 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" event={"ID":"9d685221-0f29-401d-af30-cbf46d71761f","Type":"ContainerDied","Data":"ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad"} Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.172001 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.241360 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.807700 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:20 crc kubenswrapper[4826]: I0319 19:20:20.818003 4826 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.080467 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" event={"ID":"9d685221-0f29-401d-af30-cbf46d71761f","Type":"ContainerStarted","Data":"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285"} Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.081985 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.085721 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrcwr" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="registry-server" containerID="cri-o://4180f287567b7892702a7e7731d229aa070a0ae74ad3d5ea4bbc1997f80f30ac" gracePeriod=2 Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.086352 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" event={"ID":"15200e4f-15d2-450d-93ff-3b26e3df0b48","Type":"ContainerStarted","Data":"06c8539cd9a6da203de24f0f362302644bbdfa63a2e96e302fee3836bf4a7a20"} Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.086393 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" event={"ID":"15200e4f-15d2-450d-93ff-3b26e3df0b48","Type":"ContainerStarted","Data":"bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e"} Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.106922 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" podStartSLOduration=4.106904706 podStartE2EDuration="4.106904706s" podCreationTimestamp="2026-03-19 19:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 19:20:21.096551365 +0000 UTC m=+1445.850619678" watchObservedRunningTime="2026-03-19 19:20:21.106904706 +0000 UTC m=+1445.860973019" Mar 19 19:20:21 crc kubenswrapper[4826]: I0319 19:20:21.165545 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" podStartSLOduration=3.1655201330000002 podStartE2EDuration="3.165520133s" podCreationTimestamp="2026-03-19 19:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:21.121931399 +0000 UTC m=+1445.875999712" watchObservedRunningTime="2026-03-19 19:20:21.165520133 +0000 UTC m=+1445.919588436" Mar 19 19:20:22 crc kubenswrapper[4826]: I0319 19:20:22.099262 4826 generic.go:334] "Generic (PLEG): container finished" podID="451fe5a3-8a05-42fd-8059-85084283a59f" containerID="4180f287567b7892702a7e7731d229aa070a0ae74ad3d5ea4bbc1997f80f30ac" exitCode=0 Mar 19 19:20:22 crc kubenswrapper[4826]: I0319 19:20:22.099589 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerDied","Data":"4180f287567b7892702a7e7731d229aa070a0ae74ad3d5ea4bbc1997f80f30ac"} Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.252017 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.311094 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content\") pod \"451fe5a3-8a05-42fd-8059-85084283a59f\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.311471 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79nqv\" (UniqueName: \"kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv\") pod \"451fe5a3-8a05-42fd-8059-85084283a59f\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.312243 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities\") pod \"451fe5a3-8a05-42fd-8059-85084283a59f\" (UID: \"451fe5a3-8a05-42fd-8059-85084283a59f\") " Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.312532 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities" (OuterVolumeSpecName: "utilities") pod "451fe5a3-8a05-42fd-8059-85084283a59f" (UID: "451fe5a3-8a05-42fd-8059-85084283a59f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.313101 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.328280 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv" (OuterVolumeSpecName: "kube-api-access-79nqv") pod "451fe5a3-8a05-42fd-8059-85084283a59f" (UID: "451fe5a3-8a05-42fd-8059-85084283a59f"). InnerVolumeSpecName "kube-api-access-79nqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.345926 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "451fe5a3-8a05-42fd-8059-85084283a59f" (UID: "451fe5a3-8a05-42fd-8059-85084283a59f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.399789 4826 scope.go:117] "RemoveContainer" containerID="36c100cd6943dc6bbc62bf14ad99171aedef87ef8190d7f788ed589a8325a9f0" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.414355 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/451fe5a3-8a05-42fd-8059-85084283a59f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.414548 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79nqv\" (UniqueName: \"kubernetes.io/projected/451fe5a3-8a05-42fd-8059-85084283a59f-kube-api-access-79nqv\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.902147 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.902669 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-central-agent" containerID="cri-o://aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb" gracePeriod=30 Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.902718 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="sg-core" containerID="cri-o://308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef" gracePeriod=30 Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.902789 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-notification-agent" containerID="cri-o://41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a" gracePeriod=30 Mar 19 
19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.902855 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="proxy-httpd" containerID="cri-o://3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a" gracePeriod=30 Mar 19 19:20:23 crc kubenswrapper[4826]: I0319 19:20:23.935124 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.244:3000/\": EOF" Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.189795 4826 generic.go:334] "Generic (PLEG): container finished" podID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerID="3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a" exitCode=0 Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.189838 4826 generic.go:334] "Generic (PLEG): container finished" podID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerID="308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef" exitCode=2 Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.189888 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerDied","Data":"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a"} Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.189917 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerDied","Data":"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef"} Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.205774 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrcwr" 
event={"ID":"451fe5a3-8a05-42fd-8059-85084283a59f","Type":"ContainerDied","Data":"e54c2aefa21c8964848c23fe1a7d6640300f53c2c7fe8fa41c7e97a22f2b8448"} Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.205822 4826 scope.go:117] "RemoveContainer" containerID="4180f287567b7892702a7e7731d229aa070a0ae74ad3d5ea4bbc1997f80f30ac" Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.205905 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrcwr" Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.255031 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.265484 4826 scope.go:117] "RemoveContainer" containerID="bb275d63b1df93aa76077b15ed436d436cf33fa341f768bdf2e8096cfcc50b93" Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.271197 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrcwr"] Mar 19 19:20:24 crc kubenswrapper[4826]: I0319 19:20:24.393921 4826 scope.go:117] "RemoveContainer" containerID="5527f42efe8f2e3c3378d7c9cd685ea7f203c135be6bef00b3f34316bb528565" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.216249 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a2c4496-cd50-45d1-a17b-23c600e9ea92","Type":"ContainerStarted","Data":"ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.219200 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38a2c36f-5d7d-4666-b6ed-833eb5e4df87","Type":"ContainerStarted","Data":"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.219298 4826 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a" gracePeriod=30 Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.221516 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerStarted","Data":"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.221539 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerStarted","Data":"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.221613 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-log" containerID="cri-o://6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b" gracePeriod=30 Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.221688 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-metadata" containerID="cri-o://12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763" gracePeriod=30 Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.227902 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerStarted","Data":"1c6eddd33bbc243a29f37127e409bd0627abb745a4dfdd243ee276c054f9641f"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.227939 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerStarted","Data":"028c65640619880612ba7013fe076a8774827041d76cc65bc92cd1b2b984e0b0"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.234271 4826 generic.go:334] "Generic (PLEG): container finished" podID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerID="aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb" exitCode=0 Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.234311 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerDied","Data":"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb"} Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.244112 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.462015761 podStartE2EDuration="9.244094139s" podCreationTimestamp="2026-03-19 19:20:16 +0000 UTC" firstStartedPulling="2026-03-19 19:20:18.22077686 +0000 UTC m=+1442.974845173" lastFinishedPulling="2026-03-19 19:20:24.002855238 +0000 UTC m=+1448.756923551" observedRunningTime="2026-03-19 19:20:25.236515325 +0000 UTC m=+1449.990583638" watchObservedRunningTime="2026-03-19 19:20:25.244094139 +0000 UTC m=+1449.998162452" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.274919 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:20:25 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:20:25 crc kubenswrapper[4826]: > Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.283108 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.942681968 podStartE2EDuration="9.283087171s" podCreationTimestamp="2026-03-19 
19:20:16 +0000 UTC" firstStartedPulling="2026-03-19 19:20:18.6651218 +0000 UTC m=+1443.419190113" lastFinishedPulling="2026-03-19 19:20:24.005527003 +0000 UTC m=+1448.759595316" observedRunningTime="2026-03-19 19:20:25.274628047 +0000 UTC m=+1450.028696370" watchObservedRunningTime="2026-03-19 19:20:25.283087171 +0000 UTC m=+1450.037155474" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.299213 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.153034882 podStartE2EDuration="8.29919143s" podCreationTimestamp="2026-03-19 19:20:17 +0000 UTC" firstStartedPulling="2026-03-19 19:20:18.860710427 +0000 UTC m=+1443.614778740" lastFinishedPulling="2026-03-19 19:20:24.006866965 +0000 UTC m=+1448.760935288" observedRunningTime="2026-03-19 19:20:25.298171265 +0000 UTC m=+1450.052239598" watchObservedRunningTime="2026-03-19 19:20:25.29919143 +0000 UTC m=+1450.053259753" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.320114 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.988352731 podStartE2EDuration="9.320096165s" podCreationTimestamp="2026-03-19 19:20:16 +0000 UTC" firstStartedPulling="2026-03-19 19:20:18.69080863 +0000 UTC m=+1443.444876943" lastFinishedPulling="2026-03-19 19:20:24.022552064 +0000 UTC m=+1448.776620377" observedRunningTime="2026-03-19 19:20:25.317026671 +0000 UTC m=+1450.071094984" watchObservedRunningTime="2026-03-19 19:20:25.320096165 +0000 UTC m=+1450.074164478" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.807775 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-xntrr"] Mar 19 19:20:25 crc kubenswrapper[4826]: E0319 19:20:25.808368 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="extract-content" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.808399 4826 
state_mem.go:107] "Deleted CPUSet assignment" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="extract-content" Mar 19 19:20:25 crc kubenswrapper[4826]: E0319 19:20:25.808432 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="registry-server" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.808438 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="registry-server" Mar 19 19:20:25 crc kubenswrapper[4826]: E0319 19:20:25.808488 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="extract-utilities" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.808496 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="extract-utilities" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.808844 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" containerName="registry-server" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.809912 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.832747 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-xntrr"] Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.877276 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts\") pod \"aodh-db-create-xntrr\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.877437 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7rc\" (UniqueName: \"kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc\") pod \"aodh-db-create-xntrr\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.931292 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-b743-account-create-update-tg8gf"] Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.933441 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.946991 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.949197 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b743-account-create-update-tg8gf"] Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.980075 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.980209 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7rc\" (UniqueName: \"kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc\") pod \"aodh-db-create-xntrr\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:25 crc kubenswrapper[4826]: I0319 19:20:25.980292 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvhd\" (UniqueName: \"kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.005353 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7rc\" (UniqueName: \"kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc\") pod \"aodh-db-create-xntrr\" (UID: 
\"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.047373 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451fe5a3-8a05-42fd-8059-85084283a59f" path="/var/lib/kubelet/pods/451fe5a3-8a05-42fd-8059-85084283a59f/volumes" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.048575 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts\") pod \"aodh-db-create-xntrr\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.049348 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts\") pod \"aodh-db-create-xntrr\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.127031 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.150197 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.150423 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvhd\" (UniqueName: \"kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.151707 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.163345 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.171420 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvhd\" (UniqueName: \"kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd\") pod \"aodh-b743-account-create-update-tg8gf\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.251940 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.252147 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79ptj\" (UniqueName: \"kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.252235 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.255430 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.255557 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.255586 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.255611 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd\") pod \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\" (UID: \"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e\") " Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.257033 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.261672 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.272508 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.275630 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts" (OuterVolumeSpecName: "scripts") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.287542 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj" (OuterVolumeSpecName: "kube-api-access-79ptj") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "kube-api-access-79ptj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.287764 4826 generic.go:334] "Generic (PLEG): container finished" podID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerID="6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b" exitCode=143 Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.287959 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerDied","Data":"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b"} Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.325939 4826 generic.go:334] "Generic (PLEG): container finished" podID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerID="41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a" exitCode=0 Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.326222 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.326402 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerDied","Data":"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a"} Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.326448 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af8d8fc1-0ebe-47a6-87ff-f93d31c7490e","Type":"ContainerDied","Data":"d5afd36b70a251ab89cf89cc9c73d9cb885a8d3deb8b28743481910151c6633d"} Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.326465 4826 scope.go:117] "RemoveContainer" containerID="3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.344967 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.358910 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79ptj\" (UniqueName: \"kubernetes.io/projected/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-kube-api-access-79ptj\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.358982 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.358993 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.359003 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.359012 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.410904 4826 scope.go:117] "RemoveContainer" containerID="308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.464783 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data" (OuterVolumeSpecName: "config-data") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.475160 4826 scope.go:117] "RemoveContainer" containerID="41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.523423 4826 scope.go:117] "RemoveContainer" containerID="aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.532878 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" (UID: "af8d8fc1-0ebe-47a6-87ff-f93d31c7490e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.561835 4826 scope.go:117] "RemoveContainer" containerID="3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.563854 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.563973 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.564425 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a\": container with ID starting with 3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a not found: ID does not exist" 
containerID="3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.564537 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a"} err="failed to get container status \"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a\": rpc error: code = NotFound desc = could not find container \"3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a\": container with ID starting with 3e4445853271e84079570a81ffc46b3df8683dcdabdf90ffc1294de30e4fc96a not found: ID does not exist" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.564647 4826 scope.go:117] "RemoveContainer" containerID="308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.566547 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef\": container with ID starting with 308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef not found: ID does not exist" containerID="308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.566675 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef"} err="failed to get container status \"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef\": rpc error: code = NotFound desc = could not find container \"308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef\": container with ID starting with 308f90651aebf6ede5a02875cbdf006ce7685904bcdee6e4980004b0ee3a7aef not found: ID does not exist" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.567751 4826 scope.go:117] 
"RemoveContainer" containerID="41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.569798 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a\": container with ID starting with 41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a not found: ID does not exist" containerID="41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.577144 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a"} err="failed to get container status \"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a\": rpc error: code = NotFound desc = could not find container \"41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a\": container with ID starting with 41047eb86fc17342447ec83766d21f38a4b8cf564912dc19cb0943a67d5d5b6a not found: ID does not exist" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.577230 4826 scope.go:117] "RemoveContainer" containerID="aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.591223 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb\": container with ID starting with aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb not found: ID does not exist" containerID="aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.591266 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb"} err="failed to get container status \"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb\": rpc error: code = NotFound desc = could not find container \"aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb\": container with ID starting with aece7260bc6cb44f26a3050c09b44a755c92d13e817fedd2284ec6224fe043fb not found: ID does not exist" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.728637 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.744481 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.780216 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-xntrr"] Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.803810 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.804322 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-central-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804341 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-central-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.804361 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="proxy-httpd" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804367 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="proxy-httpd" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.804401 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="sg-core" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804408 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="sg-core" Mar 19 19:20:26 crc kubenswrapper[4826]: E0319 19:20:26.804425 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-notification-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804432 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-notification-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804644 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-central-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804680 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="ceilometer-notification-agent" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804691 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="sg-core" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.804700 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" containerName="proxy-httpd" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.807030 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.812735 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.814217 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.822290 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.896479 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-b743-account-create-update-tg8gf"] Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981343 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981423 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981488 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdbg\" (UniqueName: \"kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981610 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981703 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981735 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:26 crc kubenswrapper[4826]: I0319 19:20:26.981798 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084101 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdbg\" (UniqueName: \"kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084481 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " 
pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084538 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084567 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084619 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084704 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.084744 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.085084 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.085379 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.090041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.091451 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.093220 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.095641 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.109408 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tsdbg\" (UniqueName: \"kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg\") pod \"ceilometer-0\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.141275 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.348424 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.348681 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.369295 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b743-account-create-update-tg8gf" event={"ID":"c79db235-4360-4ffc-a56e-b9fe0fad7eb6","Type":"ContainerStarted","Data":"233819d697b61141ef384fc05a50210d065f07c7f69db5d65d6425edd348850e"} Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.369345 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b743-account-create-update-tg8gf" event={"ID":"c79db235-4360-4ffc-a56e-b9fe0fad7eb6","Type":"ContainerStarted","Data":"2a001d41b29d571a125fa10b1f67414b8df9a64c8b0ae9f523d88152953ca378"} Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.377436 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xntrr" event={"ID":"6c93b8d4-4e5b-4a8d-a8fb-de909730e675","Type":"ContainerStarted","Data":"ec3eb7c4919e7e32538e479de44132a4e46e4a61c8b886700cefcb8d689aab77"} Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.377472 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xntrr" 
event={"ID":"6c93b8d4-4e5b-4a8d-a8fb-de909730e675","Type":"ContainerStarted","Data":"9d2983cb77566bae1c8a715cc266f593081b1f23840e46da89b3d93ff9546de8"} Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.386077 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-b743-account-create-update-tg8gf" podStartSLOduration=2.386059499 podStartE2EDuration="2.386059499s" podCreationTimestamp="2026-03-19 19:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:27.382716657 +0000 UTC m=+1452.136784980" watchObservedRunningTime="2026-03-19 19:20:27.386059499 +0000 UTC m=+1452.140127812" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.418415 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.417648 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-xntrr" podStartSLOduration=2.417630981 podStartE2EDuration="2.417630981s" podCreationTimestamp="2026-03-19 19:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:27.398624081 +0000 UTC m=+1452.152692414" watchObservedRunningTime="2026-03-19 19:20:27.417630981 +0000 UTC m=+1452.171699294" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.419169 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.419192 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.728436 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 
19:20:27.785479 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.797479 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.900899 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.901118 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="dnsmasq-dns" containerID="cri-o://1286bc0ae1e60c4fd1bca205079bc99eb86cd33504aed4a0de1f5eeec643da4a" gracePeriod=10 Mar 19 19:20:27 crc kubenswrapper[4826]: I0319 19:20:27.993012 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8d8fc1-0ebe-47a6-87ff-f93d31c7490e" path="/var/lib/kubelet/pods/af8d8fc1-0ebe-47a6-87ff-f93d31c7490e/volumes" Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.390974 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerStarted","Data":"2c9d73708d01c663763cd07ad9bb981eb1253d77c36a8e1d5bb5eefe3ec9a363"} Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.396194 4826 generic.go:334] "Generic (PLEG): container finished" podID="c79db235-4360-4ffc-a56e-b9fe0fad7eb6" containerID="233819d697b61141ef384fc05a50210d065f07c7f69db5d65d6425edd348850e" exitCode=0 Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.396280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b743-account-create-update-tg8gf" event={"ID":"c79db235-4360-4ffc-a56e-b9fe0fad7eb6","Type":"ContainerDied","Data":"233819d697b61141ef384fc05a50210d065f07c7f69db5d65d6425edd348850e"} Mar 19 19:20:28 crc kubenswrapper[4826]: 
I0319 19:20:28.413336 4826 generic.go:334] "Generic (PLEG): container finished" podID="6c93b8d4-4e5b-4a8d-a8fb-de909730e675" containerID="ec3eb7c4919e7e32538e479de44132a4e46e4a61c8b886700cefcb8d689aab77" exitCode=0 Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.413450 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xntrr" event={"ID":"6c93b8d4-4e5b-4a8d-a8fb-de909730e675","Type":"ContainerDied","Data":"ec3eb7c4919e7e32538e479de44132a4e46e4a61c8b886700cefcb8d689aab77"} Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.417887 4826 generic.go:334] "Generic (PLEG): container finished" podID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerID="1286bc0ae1e60c4fd1bca205079bc99eb86cd33504aed4a0de1f5eeec643da4a" exitCode=0 Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.419074 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" event={"ID":"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc","Type":"ContainerDied","Data":"1286bc0ae1e60c4fd1bca205079bc99eb86cd33504aed4a0de1f5eeec643da4a"} Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.498311 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.502261 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.502575 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:20:28 crc 
kubenswrapper[4826]: I0319 19:20:28.846942 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.946495 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.946835 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.947253 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.947424 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.947523 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlflq\" (UniqueName: \"kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 
19:20:28.947610 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config\") pod \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\" (UID: \"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc\") " Mar 19 19:20:28 crc kubenswrapper[4826]: I0319 19:20:28.960865 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq" (OuterVolumeSpecName: "kube-api-access-vlflq") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "kube-api-access-vlflq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.017469 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.035037 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.050800 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.050832 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.050842 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlflq\" (UniqueName: \"kubernetes.io/projected/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-kube-api-access-vlflq\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.052630 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.061278 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.064407 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config" (OuterVolumeSpecName: "config") pod "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" (UID: "e4769a42-cedc-492d-a6bf-e1ccf3cd77bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.153159 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.153266 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.153336 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.434244 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" event={"ID":"e4769a42-cedc-492d-a6bf-e1ccf3cd77bc","Type":"ContainerDied","Data":"3160f3b77bf7ac243fdd1c5ab2cc5953a965a3d7bc54a238ae25234fc78ed0cf"} Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.434312 4826 scope.go:117] "RemoveContainer" containerID="1286bc0ae1e60c4fd1bca205079bc99eb86cd33504aed4a0de1f5eeec643da4a" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.434252 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-sspxd" Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.473087 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.484563 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-sspxd"] Mar 19 19:20:29 crc kubenswrapper[4826]: I0319 19:20:29.571148 4826 scope.go:117] "RemoveContainer" containerID="78fcfab2b14088f7eccbcaa73d544a229ad6d4384e2efc2c80135a6bef514d16" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.008753 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" path="/var/lib/kubelet/pods/e4769a42-cedc-492d-a6bf-e1ccf3cd77bc/volumes" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.301558 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.309907 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.382849 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm7rc\" (UniqueName: \"kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc\") pod \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.382988 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts\") pod \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.383067 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts\") pod \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\" (UID: \"6c93b8d4-4e5b-4a8d-a8fb-de909730e675\") " Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.383170 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjvhd\" (UniqueName: \"kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd\") pod \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\" (UID: \"c79db235-4360-4ffc-a56e-b9fe0fad7eb6\") " Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.384343 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c79db235-4360-4ffc-a56e-b9fe0fad7eb6" (UID: "c79db235-4360-4ffc-a56e-b9fe0fad7eb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.384353 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c93b8d4-4e5b-4a8d-a8fb-de909730e675" (UID: "6c93b8d4-4e5b-4a8d-a8fb-de909730e675"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.392292 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc" (OuterVolumeSpecName: "kube-api-access-wm7rc") pod "6c93b8d4-4e5b-4a8d-a8fb-de909730e675" (UID: "6c93b8d4-4e5b-4a8d-a8fb-de909730e675"). InnerVolumeSpecName "kube-api-access-wm7rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.394489 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd" (OuterVolumeSpecName: "kube-api-access-kjvhd") pod "c79db235-4360-4ffc-a56e-b9fe0fad7eb6" (UID: "c79db235-4360-4ffc-a56e-b9fe0fad7eb6"). InnerVolumeSpecName "kube-api-access-kjvhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.459526 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-b743-account-create-update-tg8gf" event={"ID":"c79db235-4360-4ffc-a56e-b9fe0fad7eb6","Type":"ContainerDied","Data":"2a001d41b29d571a125fa10b1f67414b8df9a64c8b0ae9f523d88152953ca378"} Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.459571 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a001d41b29d571a125fa10b1f67414b8df9a64c8b0ae9f523d88152953ca378" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.459578 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-b743-account-create-update-tg8gf" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.462578 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-xntrr" event={"ID":"6c93b8d4-4e5b-4a8d-a8fb-de909730e675","Type":"ContainerDied","Data":"9d2983cb77566bae1c8a715cc266f593081b1f23840e46da89b3d93ff9546de8"} Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.462602 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2983cb77566bae1c8a715cc266f593081b1f23840e46da89b3d93ff9546de8" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.462676 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-xntrr" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.492313 4826 generic.go:334] "Generic (PLEG): container finished" podID="05dd165a-9504-4521-9814-4e252234fd9b" containerID="95feeb7d4dd3a671e8330b63a53f2465bf33a41ef703a61445cf7d0fb1715f3d" exitCode=0 Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.492366 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jsjkk" event={"ID":"05dd165a-9504-4521-9814-4e252234fd9b","Type":"ContainerDied","Data":"95feeb7d4dd3a671e8330b63a53f2465bf33a41ef703a61445cf7d0fb1715f3d"} Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.497624 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjvhd\" (UniqueName: \"kubernetes.io/projected/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-kube-api-access-kjvhd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.497679 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm7rc\" (UniqueName: \"kubernetes.io/projected/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-kube-api-access-wm7rc\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.497772 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c79db235-4360-4ffc-a56e-b9fe0fad7eb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:30 crc kubenswrapper[4826]: I0319 19:20:30.497806 4826 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c93b8d4-4e5b-4a8d-a8fb-de909730e675-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.249240 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-76txf"] Mar 19 19:20:31 crc kubenswrapper[4826]: E0319 19:20:31.250184 4826 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="dnsmasq-dns" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250208 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="dnsmasq-dns" Mar 19 19:20:31 crc kubenswrapper[4826]: E0319 19:20:31.250237 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79db235-4360-4ffc-a56e-b9fe0fad7eb6" containerName="mariadb-account-create-update" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250244 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79db235-4360-4ffc-a56e-b9fe0fad7eb6" containerName="mariadb-account-create-update" Mar 19 19:20:31 crc kubenswrapper[4826]: E0319 19:20:31.250258 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="init" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250264 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="init" Mar 19 19:20:31 crc kubenswrapper[4826]: E0319 19:20:31.250282 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c93b8d4-4e5b-4a8d-a8fb-de909730e675" containerName="mariadb-database-create" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250288 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c93b8d4-4e5b-4a8d-a8fb-de909730e675" containerName="mariadb-database-create" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250513 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4769a42-cedc-492d-a6bf-e1ccf3cd77bc" containerName="dnsmasq-dns" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250563 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c93b8d4-4e5b-4a8d-a8fb-de909730e675" containerName="mariadb-database-create" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.250584 4826 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c79db235-4360-4ffc-a56e-b9fe0fad7eb6" containerName="mariadb-account-create-update" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.251648 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.266148 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.266386 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.266615 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pqw5p" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.266772 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.280314 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-76txf"] Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.316118 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.316278 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.316381 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h98m\" (UniqueName: \"kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.316455 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.418358 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.418477 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.418567 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h98m\" (UniqueName: \"kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.418615 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.423260 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.424367 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.426332 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.440437 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h98m\" (UniqueName: \"kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m\") pod \"aodh-db-sync-76txf\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.507028 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerStarted","Data":"2cf3f8b4eb2d75d2931cdb391a0a30b223b7a2fd2875a8ac038f3d303a1604a2"} Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.507083 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerStarted","Data":"a4c9c2b33151051406b8c7d44641c2d089b3e04330b8aece4a3db25b071301b1"} Mar 19 19:20:31 crc kubenswrapper[4826]: I0319 19:20:31.593455 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.238943 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.349073 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle\") pod \"05dd165a-9504-4521-9814-4e252234fd9b\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.349178 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5kqk\" (UniqueName: \"kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk\") pod \"05dd165a-9504-4521-9814-4e252234fd9b\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.349215 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts\") pod \"05dd165a-9504-4521-9814-4e252234fd9b\" (UID: \"05dd165a-9504-4521-9814-4e252234fd9b\") " Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.349301 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data\") pod \"05dd165a-9504-4521-9814-4e252234fd9b\" (UID: 
\"05dd165a-9504-4521-9814-4e252234fd9b\") " Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.355524 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts" (OuterVolumeSpecName: "scripts") pod "05dd165a-9504-4521-9814-4e252234fd9b" (UID: "05dd165a-9504-4521-9814-4e252234fd9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.355820 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk" (OuterVolumeSpecName: "kube-api-access-k5kqk") pod "05dd165a-9504-4521-9814-4e252234fd9b" (UID: "05dd165a-9504-4521-9814-4e252234fd9b"). InnerVolumeSpecName "kube-api-access-k5kqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.394758 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05dd165a-9504-4521-9814-4e252234fd9b" (UID: "05dd165a-9504-4521-9814-4e252234fd9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.408822 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data" (OuterVolumeSpecName: "config-data") pod "05dd165a-9504-4521-9814-4e252234fd9b" (UID: "05dd165a-9504-4521-9814-4e252234fd9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:32 crc kubenswrapper[4826]: W0319 19:20:32.422346 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f41dad7_21ad_43b0_9e07_443bcd0c8c6a.slice/crio-bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79 WatchSource:0}: Error finding container bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79: Status 404 returned error can't find the container with id bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79 Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.425031 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-76txf"] Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.452337 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.452371 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5kqk\" (UniqueName: \"kubernetes.io/projected/05dd165a-9504-4521-9814-4e252234fd9b-kube-api-access-k5kqk\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.452382 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.452393 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05dd165a-9504-4521-9814-4e252234fd9b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.526319 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jsjkk" 
event={"ID":"05dd165a-9504-4521-9814-4e252234fd9b","Type":"ContainerDied","Data":"621a88c21f917f8ce1803918371a7b1350e5fae6d77e5417f442bfa39c4b81f6"} Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.526404 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="621a88c21f917f8ce1803918371a7b1350e5fae6d77e5417f442bfa39c4b81f6" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.526332 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jsjkk" Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.527671 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-76txf" event={"ID":"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a","Type":"ContainerStarted","Data":"bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79"} Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.530152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerStarted","Data":"a2589dabf42c4588b0cc6dc78578f7f4a80fc586d2d738b8fdb0dcb2e6d94270"} Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.713375 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.713674 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-log" containerID="cri-o://028c65640619880612ba7013fe076a8774827041d76cc65bc92cd1b2b984e0b0" gracePeriod=30 Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.713736 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-api" containerID="cri-o://1c6eddd33bbc243a29f37127e409bd0627abb745a4dfdd243ee276c054f9641f" gracePeriod=30 Mar 19 
19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.737833 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:32 crc kubenswrapper[4826]: I0319 19:20:32.738036 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" containerName="nova-scheduler-scheduler" containerID="cri-o://ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" gracePeriod=30 Mar 19 19:20:33 crc kubenswrapper[4826]: I0319 19:20:33.542769 4826 generic.go:334] "Generic (PLEG): container finished" podID="15200e4f-15d2-450d-93ff-3b26e3df0b48" containerID="06c8539cd9a6da203de24f0f362302644bbdfa63a2e96e302fee3836bf4a7a20" exitCode=0 Mar 19 19:20:33 crc kubenswrapper[4826]: I0319 19:20:33.542876 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" event={"ID":"15200e4f-15d2-450d-93ff-3b26e3df0b48","Type":"ContainerDied","Data":"06c8539cd9a6da203de24f0f362302644bbdfa63a2e96e302fee3836bf4a7a20"} Mar 19 19:20:33 crc kubenswrapper[4826]: I0319 19:20:33.547298 4826 generic.go:334] "Generic (PLEG): container finished" podID="69c44859-378d-42ac-af83-30fd8b5e9010" containerID="028c65640619880612ba7013fe076a8774827041d76cc65bc92cd1b2b984e0b0" exitCode=143 Mar 19 19:20:33 crc kubenswrapper[4826]: I0319 19:20:33.547338 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerDied","Data":"028c65640619880612ba7013fe076a8774827041d76cc65bc92cd1b2b984e0b0"} Mar 19 19:20:34 crc kubenswrapper[4826]: I0319 19:20:34.562492 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerStarted","Data":"eb9eda7b0b40134844a93ca83c22c387b8b8f574ea7088bae71c9a89f7d2d772"} Mar 19 19:20:34 crc kubenswrapper[4826]: I0319 19:20:34.585146 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262870851 podStartE2EDuration="8.585131685s" podCreationTimestamp="2026-03-19 19:20:26 +0000 UTC" firstStartedPulling="2026-03-19 19:20:27.754833211 +0000 UTC m=+1452.508901524" lastFinishedPulling="2026-03-19 19:20:34.077094045 +0000 UTC m=+1458.831162358" observedRunningTime="2026-03-19 19:20:34.580893272 +0000 UTC m=+1459.334961585" watchObservedRunningTime="2026-03-19 19:20:34.585131685 +0000 UTC m=+1459.339199998" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.090447 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.212731 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" probeResult="failure" output=< Mar 19 19:20:35 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:20:35 crc kubenswrapper[4826]: > Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.217764 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5fc\" (UniqueName: \"kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc\") pod \"15200e4f-15d2-450d-93ff-3b26e3df0b48\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.217847 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts\") pod \"15200e4f-15d2-450d-93ff-3b26e3df0b48\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.217874 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle\") pod \"15200e4f-15d2-450d-93ff-3b26e3df0b48\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.217908 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data\") pod \"15200e4f-15d2-450d-93ff-3b26e3df0b48\" (UID: \"15200e4f-15d2-450d-93ff-3b26e3df0b48\") " Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.225950 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc" (OuterVolumeSpecName: "kube-api-access-xb5fc") pod "15200e4f-15d2-450d-93ff-3b26e3df0b48" (UID: "15200e4f-15d2-450d-93ff-3b26e3df0b48"). InnerVolumeSpecName "kube-api-access-xb5fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.227382 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts" (OuterVolumeSpecName: "scripts") pod "15200e4f-15d2-450d-93ff-3b26e3df0b48" (UID: "15200e4f-15d2-450d-93ff-3b26e3df0b48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.260365 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15200e4f-15d2-450d-93ff-3b26e3df0b48" (UID: "15200e4f-15d2-450d-93ff-3b26e3df0b48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.273225 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data" (OuterVolumeSpecName: "config-data") pod "15200e4f-15d2-450d-93ff-3b26e3df0b48" (UID: "15200e4f-15d2-450d-93ff-3b26e3df0b48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.322076 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.322109 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5fc\" (UniqueName: \"kubernetes.io/projected/15200e4f-15d2-450d-93ff-3b26e3df0b48-kube-api-access-xb5fc\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.322121 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.322129 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15200e4f-15d2-450d-93ff-3b26e3df0b48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.419274 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.419325 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.485915 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.487393 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.584696 4826 generic.go:334] "Generic (PLEG): container finished" podID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" containerID="ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" exitCode=0 Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.584762 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a2c4496-cd50-45d1-a17b-23c600e9ea92","Type":"ContainerDied","Data":"ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706"} Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.594419 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" event={"ID":"15200e4f-15d2-450d-93ff-3b26e3df0b48","Type":"ContainerDied","Data":"bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e"} Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.594489 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb03eababb0663669edb466768a49bf5db7f7e19233616111a77042fe65db52e" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.594497 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mdwdm" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.595416 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.640669 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:20:35 crc kubenswrapper[4826]: E0319 19:20:35.641213 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd165a-9504-4521-9814-4e252234fd9b" containerName="nova-manage" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.641235 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd165a-9504-4521-9814-4e252234fd9b" containerName="nova-manage" Mar 19 19:20:35 crc kubenswrapper[4826]: E0319 19:20:35.641286 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15200e4f-15d2-450d-93ff-3b26e3df0b48" containerName="nova-cell1-conductor-db-sync" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.641293 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="15200e4f-15d2-450d-93ff-3b26e3df0b48" containerName="nova-cell1-conductor-db-sync" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.641504 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd165a-9504-4521-9814-4e252234fd9b" containerName="nova-manage" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.641527 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="15200e4f-15d2-450d-93ff-3b26e3df0b48" containerName="nova-cell1-conductor-db-sync" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.642839 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.645867 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.658879 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.731871 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.731929 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjcw\" (UniqueName: \"kubernetes.io/projected/fb070564-2600-49cb-92d3-f8097addd815-kube-api-access-tjjcw\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.732197 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.838005 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc 
kubenswrapper[4826]: I0319 19:20:35.838138 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.838164 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjcw\" (UniqueName: \"kubernetes.io/projected/fb070564-2600-49cb-92d3-f8097addd815-kube-api-access-tjjcw\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.842927 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.854540 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjcw\" (UniqueName: \"kubernetes.io/projected/fb070564-2600-49cb-92d3-f8097addd815-kube-api-access-tjjcw\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.870836 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb070564-2600-49cb-92d3-f8097addd815-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fb070564-2600-49cb-92d3-f8097addd815\") " pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:35 crc kubenswrapper[4826]: I0319 19:20:35.970770 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:36 crc kubenswrapper[4826]: I0319 19:20:36.613417 4826 generic.go:334] "Generic (PLEG): container finished" podID="69c44859-378d-42ac-af83-30fd8b5e9010" containerID="1c6eddd33bbc243a29f37127e409bd0627abb745a4dfdd243ee276c054f9641f" exitCode=0 Mar 19 19:20:36 crc kubenswrapper[4826]: I0319 19:20:36.613724 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerDied","Data":"1c6eddd33bbc243a29f37127e409bd0627abb745a4dfdd243ee276c054f9641f"} Mar 19 19:20:37 crc kubenswrapper[4826]: E0319 19:20:37.360105 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706 is running failed: container process not found" containerID="ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:20:37 crc kubenswrapper[4826]: E0319 19:20:37.360909 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706 is running failed: container process not found" containerID="ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:20:37 crc kubenswrapper[4826]: E0319 19:20:37.361466 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706 is running failed: container process not found" containerID="ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] 
Mar 19 19:20:37 crc kubenswrapper[4826]: E0319 19:20:37.361508 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" containerName="nova-scheduler-scheduler" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.041096 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.059833 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117191 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5lrn\" (UniqueName: \"kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn\") pod \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117613 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs\") pod \"69c44859-378d-42ac-af83-30fd8b5e9010\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117632 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle\") pod \"69c44859-378d-42ac-af83-30fd8b5e9010\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117680 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data\") pod \"69c44859-378d-42ac-af83-30fd8b5e9010\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117702 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data\") pod \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117743 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr7kj\" (UniqueName: \"kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj\") pod \"69c44859-378d-42ac-af83-30fd8b5e9010\" (UID: \"69c44859-378d-42ac-af83-30fd8b5e9010\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.117798 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle\") pod \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\" (UID: \"6a2c4496-cd50-45d1-a17b-23c600e9ea92\") " Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.120902 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs" (OuterVolumeSpecName: "logs") pod "69c44859-378d-42ac-af83-30fd8b5e9010" (UID: "69c44859-378d-42ac-af83-30fd8b5e9010"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.125118 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn" (OuterVolumeSpecName: "kube-api-access-z5lrn") pod "6a2c4496-cd50-45d1-a17b-23c600e9ea92" (UID: "6a2c4496-cd50-45d1-a17b-23c600e9ea92"). InnerVolumeSpecName "kube-api-access-z5lrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.140760 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj" (OuterVolumeSpecName: "kube-api-access-fr7kj") pod "69c44859-378d-42ac-af83-30fd8b5e9010" (UID: "69c44859-378d-42ac-af83-30fd8b5e9010"). InnerVolumeSpecName "kube-api-access-fr7kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.172421 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data" (OuterVolumeSpecName: "config-data") pod "69c44859-378d-42ac-af83-30fd8b5e9010" (UID: "69c44859-378d-42ac-af83-30fd8b5e9010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.185340 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69c44859-378d-42ac-af83-30fd8b5e9010" (UID: "69c44859-378d-42ac-af83-30fd8b5e9010"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.189143 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data" (OuterVolumeSpecName: "config-data") pod "6a2c4496-cd50-45d1-a17b-23c600e9ea92" (UID: "6a2c4496-cd50-45d1-a17b-23c600e9ea92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.190833 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a2c4496-cd50-45d1-a17b-23c600e9ea92" (UID: "6a2c4496-cd50-45d1-a17b-23c600e9ea92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.198993 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.220947 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.220987 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5lrn\" (UniqueName: \"kubernetes.io/projected/6a2c4496-cd50-45d1-a17b-23c600e9ea92-kube-api-access-z5lrn\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.221004 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69c44859-378d-42ac-af83-30fd8b5e9010-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.221017 4826 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.221028 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69c44859-378d-42ac-af83-30fd8b5e9010-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.221041 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c4496-cd50-45d1-a17b-23c600e9ea92-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.221052 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr7kj\" (UniqueName: \"kubernetes.io/projected/69c44859-378d-42ac-af83-30fd8b5e9010-kube-api-access-fr7kj\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.727573 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fb070564-2600-49cb-92d3-f8097addd815","Type":"ContainerStarted","Data":"741a7101f7ecfe898c80dcb68aac5e3e92acb8c28a086451522f167d1c40b532"} Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.727980 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fb070564-2600-49cb-92d3-f8097addd815","Type":"ContainerStarted","Data":"87dcc2370b31e38e7d80555c9f95f1a4c896d7623c2157bf01b534b9589b15db"} Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.728006 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.729375 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"6a2c4496-cd50-45d1-a17b-23c600e9ea92","Type":"ContainerDied","Data":"1c502d9c4c609c0365f53f2d4482ca38dcb3ae0157f3ad190663c9766218349c"} Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.729412 4826 scope.go:117] "RemoveContainer" containerID="ffb8232a3bccd7c7c41bdeb4f676dab4738a62c0f6b1935af0c660b5b69be706" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.729412 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.734692 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69c44859-378d-42ac-af83-30fd8b5e9010","Type":"ContainerDied","Data":"cd16d1579f38d37e6a172017d701d37e70973847e05a6d68cb826a44bd7a78fc"} Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.734750 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.737455 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-76txf" event={"ID":"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a","Type":"ContainerStarted","Data":"7ce6eef6df66bd1bb3b844606039927fd901641babd83bd3643f424e56a42ffa"} Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.752247 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=4.75222552 podStartE2EDuration="4.75222552s" podCreationTimestamp="2026-03-19 19:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:39.745742953 +0000 UTC m=+1464.499811266" watchObservedRunningTime="2026-03-19 19:20:39.75222552 +0000 UTC m=+1464.506293833" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.760185 4826 scope.go:117] "RemoveContainer" 
containerID="1c6eddd33bbc243a29f37127e409bd0627abb745a4dfdd243ee276c054f9641f" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.777373 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-76txf" podStartSLOduration=2.585166656 podStartE2EDuration="8.777354178s" podCreationTimestamp="2026-03-19 19:20:31 +0000 UTC" firstStartedPulling="2026-03-19 19:20:32.424439822 +0000 UTC m=+1457.178508135" lastFinishedPulling="2026-03-19 19:20:38.616627344 +0000 UTC m=+1463.370695657" observedRunningTime="2026-03-19 19:20:39.775110293 +0000 UTC m=+1464.529178616" watchObservedRunningTime="2026-03-19 19:20:39.777354178 +0000 UTC m=+1464.531422481" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.787792 4826 scope.go:117] "RemoveContainer" containerID="028c65640619880612ba7013fe076a8774827041d76cc65bc92cd1b2b984e0b0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.819862 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.832937 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.846612 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.857152 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.869435 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: E0319 19:20:39.870137 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" containerName="nova-scheduler-scheduler" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870155 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" 
containerName="nova-scheduler-scheduler" Mar 19 19:20:39 crc kubenswrapper[4826]: E0319 19:20:39.870198 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-log" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870208 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-log" Mar 19 19:20:39 crc kubenswrapper[4826]: E0319 19:20:39.870222 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-api" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870230 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-api" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870548 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-log" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870591 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" containerName="nova-scheduler-scheduler" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.870609 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" containerName="nova-api-api" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.872172 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.883842 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.885922 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.894090 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.896137 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.898928 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.905437 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.944049 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.944112 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rvr\" (UniqueName: \"kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.944139 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.944286 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.944329 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.951625 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:39 crc kubenswrapper[4826]: I0319 19:20:39.951745 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxl6\" (UniqueName: \"kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.003226 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c44859-378d-42ac-af83-30fd8b5e9010" path="/var/lib/kubelet/pods/69c44859-378d-42ac-af83-30fd8b5e9010/volumes" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.004022 4826 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2c4496-cd50-45d1-a17b-23c600e9ea92" path="/var/lib/kubelet/pods/6a2c4496-cd50-45d1-a17b-23c600e9ea92/volumes" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.054722 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.054771 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rvr\" (UniqueName: \"kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.054790 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.054894 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.054925 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc 
kubenswrapper[4826]: I0319 19:20:40.054971 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.055008 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxl6\" (UniqueName: \"kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.055631 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.058755 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.063198 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.084482 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data\") pod \"nova-api-0\" 
(UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.085051 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.087810 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxl6\" (UniqueName: \"kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6\") pod \"nova-api-0\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.088188 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rvr\" (UniqueName: \"kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr\") pod \"nova-scheduler-0\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.210217 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.227401 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.714234 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.756160 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerStarted","Data":"53c853ab47911738e92f7646bfa6c9ac86adf0f6c8bf6986a0d0c7f508a945d6"} Mar 19 19:20:40 crc kubenswrapper[4826]: I0319 19:20:40.883517 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:20:40 crc kubenswrapper[4826]: W0319 19:20:40.886180 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96becd19_70c0_4cc6_babc_650f7c8bac8f.slice/crio-84f337e41f248c89afa8077e2022e8d63829c257ded0212b17dfdd4a3a5619e0 WatchSource:0}: Error finding container 84f337e41f248c89afa8077e2022e8d63829c257ded0212b17dfdd4a3a5619e0: Status 404 returned error can't find the container with id 84f337e41f248c89afa8077e2022e8d63829c257ded0212b17dfdd4a3a5619e0 Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.785735 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerStarted","Data":"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8"} Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.786126 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerStarted","Data":"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5"} Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.787552 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"96becd19-70c0-4cc6-babc-650f7c8bac8f","Type":"ContainerStarted","Data":"a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a"} Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.787573 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96becd19-70c0-4cc6-babc-650f7c8bac8f","Type":"ContainerStarted","Data":"84f337e41f248c89afa8077e2022e8d63829c257ded0212b17dfdd4a3a5619e0"} Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.818207 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.818189363 podStartE2EDuration="2.818189363s" podCreationTimestamp="2026-03-19 19:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:41.807575067 +0000 UTC m=+1466.561643380" watchObservedRunningTime="2026-03-19 19:20:41.818189363 +0000 UTC m=+1466.572257676" Mar 19 19:20:41 crc kubenswrapper[4826]: I0319 19:20:41.829221 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.829202589 podStartE2EDuration="2.829202589s" podCreationTimestamp="2026-03-19 19:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:41.828051361 +0000 UTC m=+1466.582119684" watchObservedRunningTime="2026-03-19 19:20:41.829202589 +0000 UTC m=+1466.583270922" Mar 19 19:20:42 crc kubenswrapper[4826]: I0319 19:20:42.801939 4826 generic.go:334] "Generic (PLEG): container finished" podID="1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" containerID="7ce6eef6df66bd1bb3b844606039927fd901641babd83bd3643f424e56a42ffa" exitCode=0 Mar 19 19:20:42 crc kubenswrapper[4826]: I0319 19:20:42.802047 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-76txf" 
event={"ID":"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a","Type":"ContainerDied","Data":"7ce6eef6df66bd1bb3b844606039927fd901641babd83bd3643f424e56a42ffa"} Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.213589 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.360974 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.378313 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.466166 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.486151 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data\") pod \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.486395 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h98m\" (UniqueName: \"kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m\") pod \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.486466 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle\") pod \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 
19:20:44.486518 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts\") pod \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\" (UID: \"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a\") " Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.492067 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m" (OuterVolumeSpecName: "kube-api-access-7h98m") pod "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" (UID: "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a"). InnerVolumeSpecName "kube-api-access-7h98m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.495773 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts" (OuterVolumeSpecName: "scripts") pod "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" (UID: "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.519507 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data" (OuterVolumeSpecName: "config-data") pod "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" (UID: "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.519720 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" (UID: "1f41dad7-21ad-43b0-9e07-443bcd0c8c6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.589250 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h98m\" (UniqueName: \"kubernetes.io/projected/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-kube-api-access-7h98m\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.589291 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.589304 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.589317 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.832010 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-76txf" event={"ID":"1f41dad7-21ad-43b0-9e07-443bcd0c8c6a","Type":"ContainerDied","Data":"bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79"} Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.832054 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd8908e3f9fb820a01ccd53012e69295f976f3116773c3722c221a0af17e6b79" Mar 19 19:20:44 crc kubenswrapper[4826]: I0319 19:20:44.832067 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-76txf" Mar 19 19:20:45 crc kubenswrapper[4826]: I0319 19:20:45.228275 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 19:20:45 crc kubenswrapper[4826]: I0319 19:20:45.845587 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cml5k" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" containerID="cri-o://c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca" gracePeriod=2 Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.013749 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.376824 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 19:20:46 crc kubenswrapper[4826]: E0319 19:20:46.377518 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" containerName="aodh-db-sync" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.377540 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" containerName="aodh-db-sync" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.377861 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" containerName="aodh-db-sync" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.380353 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.383352 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pqw5p" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.384398 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.384539 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.392081 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.433741 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.434308 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.434679 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.434800 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzrgb\" (UniqueName: 
\"kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.536647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.536860 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.536929 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzrgb\" (UniqueName: \"kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.537061 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.543480 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.543512 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.544821 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.560745 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzrgb\" (UniqueName: \"kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb\") pod \"aodh-0\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.652810 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.740850 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content\") pod \"af2662b9-3873-4947-9793-e7e1c6611dcb\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.741234 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7v9w\" (UniqueName: \"kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w\") pod \"af2662b9-3873-4947-9793-e7e1c6611dcb\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.741272 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities\") pod \"af2662b9-3873-4947-9793-e7e1c6611dcb\" (UID: \"af2662b9-3873-4947-9793-e7e1c6611dcb\") " Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.742104 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.742115 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities" (OuterVolumeSpecName: "utilities") pod "af2662b9-3873-4947-9793-e7e1c6611dcb" (UID: "af2662b9-3873-4947-9793-e7e1c6611dcb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.754682 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w" (OuterVolumeSpecName: "kube-api-access-h7v9w") pod "af2662b9-3873-4947-9793-e7e1c6611dcb" (UID: "af2662b9-3873-4947-9793-e7e1c6611dcb"). InnerVolumeSpecName "kube-api-access-h7v9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.844171 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7v9w\" (UniqueName: \"kubernetes.io/projected/af2662b9-3873-4947-9793-e7e1c6611dcb-kube-api-access-h7v9w\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.844202 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.866535 4826 generic.go:334] "Generic (PLEG): container finished" podID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerID="c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca" exitCode=0 Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.866821 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerDied","Data":"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca"} Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.866851 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cml5k" event={"ID":"af2662b9-3873-4947-9793-e7e1c6611dcb","Type":"ContainerDied","Data":"91350502dcb210944af251ae5a18f9020c8e7524f77d556eca5954e0f74c19fa"} Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 
19:20:46.866869 4826 scope.go:117] "RemoveContainer" containerID="c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.867001 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cml5k" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.890529 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af2662b9-3873-4947-9793-e7e1c6611dcb" (UID: "af2662b9-3873-4947-9793-e7e1c6611dcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.924469 4826 scope.go:117] "RemoveContainer" containerID="3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.948571 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af2662b9-3873-4947-9793-e7e1c6611dcb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:46 crc kubenswrapper[4826]: I0319 19:20:46.979791 4826 scope.go:117] "RemoveContainer" containerID="90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.082945 4826 scope.go:117] "RemoveContainer" containerID="c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca" Mar 19 19:20:47 crc kubenswrapper[4826]: E0319 19:20:47.083799 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca\": container with ID starting with c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca not found: ID does not exist" 
containerID="c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.083908 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca"} err="failed to get container status \"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca\": rpc error: code = NotFound desc = could not find container \"c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca\": container with ID starting with c527ac1ea75fd336044e51d8b3c0ca4ff65d1d8505818e1642568cf5ef2fafca not found: ID does not exist" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.083999 4826 scope.go:117] "RemoveContainer" containerID="3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef" Mar 19 19:20:47 crc kubenswrapper[4826]: E0319 19:20:47.084341 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef\": container with ID starting with 3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef not found: ID does not exist" containerID="3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.084420 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef"} err="failed to get container status \"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef\": rpc error: code = NotFound desc = could not find container \"3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef\": container with ID starting with 3be29d70298c1093bc3ea79f530bfa5d2598792bf372dd42f474edfb330146ef not found: ID does not exist" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.084483 4826 scope.go:117] 
"RemoveContainer" containerID="90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc" Mar 19 19:20:47 crc kubenswrapper[4826]: E0319 19:20:47.086316 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc\": container with ID starting with 90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc not found: ID does not exist" containerID="90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.086343 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc"} err="failed to get container status \"90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc\": rpc error: code = NotFound desc = could not find container \"90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc\": container with ID starting with 90d79c80978df5290ed3480f9117dd794bb87506bd39c0812f84a12d5ae758cc not found: ID does not exist" Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.215174 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.227948 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cml5k"] Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.280981 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 19:20:47 crc kubenswrapper[4826]: W0319 19:20:47.302449 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5835e9d_d28e_47a1_aa89_c61f110f2f53.slice/crio-75fef9943fcfbb86259f93012d6d2771ddc07682ef67813d64a9442994826d17 WatchSource:0}: Error finding container 
75fef9943fcfbb86259f93012d6d2771ddc07682ef67813d64a9442994826d17: Status 404 returned error can't find the container with id 75fef9943fcfbb86259f93012d6d2771ddc07682ef67813d64a9442994826d17 Mar 19 19:20:47 crc kubenswrapper[4826]: I0319 19:20:47.891814 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerStarted","Data":"75fef9943fcfbb86259f93012d6d2771ddc07682ef67813d64a9442994826d17"} Mar 19 19:20:48 crc kubenswrapper[4826]: I0319 19:20:48.025322 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" path="/var/lib/kubelet/pods/af2662b9-3873-4947-9793-e7e1c6611dcb/volumes" Mar 19 19:20:48 crc kubenswrapper[4826]: I0319 19:20:48.913644 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerStarted","Data":"066bb8481e3cc8105d0dc22392d969ef963ef4abb0cfa4d433ffeff5fca3f3ac"} Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.033281 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.033806 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-central-agent" containerID="cri-o://2cf3f8b4eb2d75d2931cdb391a0a30b223b7a2fd2875a8ac038f3d303a1604a2" gracePeriod=30 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.034667 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="sg-core" containerID="cri-o://a2589dabf42c4588b0cc6dc78578f7f4a80fc586d2d738b8fdb0dcb2e6d94270" gracePeriod=30 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.034775 4826 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="proxy-httpd" containerID="cri-o://eb9eda7b0b40134844a93ca83c22c387b8b8f574ea7088bae71c9a89f7d2d772" gracePeriod=30 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.034822 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-notification-agent" containerID="cri-o://a4c9c2b33151051406b8c7d44641c2d089b3e04330b8aece4a3db25b071301b1" gracePeriod=30 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.051408 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.254:3000/\": EOF" Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.758015 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.933923 4826 generic.go:334] "Generic (PLEG): container finished" podID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerID="eb9eda7b0b40134844a93ca83c22c387b8b8f574ea7088bae71c9a89f7d2d772" exitCode=0 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934267 4826 generic.go:334] "Generic (PLEG): container finished" podID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerID="a2589dabf42c4588b0cc6dc78578f7f4a80fc586d2d738b8fdb0dcb2e6d94270" exitCode=2 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934280 4826 generic.go:334] "Generic (PLEG): container finished" podID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerID="a4c9c2b33151051406b8c7d44641c2d089b3e04330b8aece4a3db25b071301b1" exitCode=0 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934292 4826 generic.go:334] "Generic (PLEG): container finished" podID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" 
containerID="2cf3f8b4eb2d75d2931cdb391a0a30b223b7a2fd2875a8ac038f3d303a1604a2" exitCode=0 Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934094 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerDied","Data":"eb9eda7b0b40134844a93ca83c22c387b8b8f574ea7088bae71c9a89f7d2d772"} Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934337 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerDied","Data":"a2589dabf42c4588b0cc6dc78578f7f4a80fc586d2d738b8fdb0dcb2e6d94270"} Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934357 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerDied","Data":"a4c9c2b33151051406b8c7d44641c2d089b3e04330b8aece4a3db25b071301b1"} Mar 19 19:20:49 crc kubenswrapper[4826]: I0319 19:20:49.934372 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerDied","Data":"2cf3f8b4eb2d75d2931cdb391a0a30b223b7a2fd2875a8ac038f3d303a1604a2"} Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.211479 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.211532 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.231095 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.282289 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 
19:20:50.495002 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.583574 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.583618 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.583750 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdbg\" (UniqueName: \"kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.583774 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.584156 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.584176 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.584019 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.584535 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.584625 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle\") pod \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\" (UID: \"35ddd07d-67cd-45c0-a20f-956fcadee9f2\") " Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.585415 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.585433 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35ddd07d-67cd-45c0-a20f-956fcadee9f2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.600801 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts" (OuterVolumeSpecName: "scripts") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.627144 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg" (OuterVolumeSpecName: "kube-api-access-tsdbg") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "kube-api-access-tsdbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.687267 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdbg\" (UniqueName: \"kubernetes.io/projected/35ddd07d-67cd-45c0-a20f-956fcadee9f2-kube-api-access-tsdbg\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.687296 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.695893 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.770311 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.788839 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.788871 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.791026 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data" (OuterVolumeSpecName: "config-data") pod "35ddd07d-67cd-45c0-a20f-956fcadee9f2" (UID: "35ddd07d-67cd-45c0-a20f-956fcadee9f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.891422 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ddd07d-67cd-45c0-a20f-956fcadee9f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.974445 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerStarted","Data":"0d7b9305c553990c2ae667a61313bf01d6ab1104cbc53cc92394c953df61bc43"} Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.993078 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.994094 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35ddd07d-67cd-45c0-a20f-956fcadee9f2","Type":"ContainerDied","Data":"2c9d73708d01c663763cd07ad9bb981eb1253d77c36a8e1d5bb5eefe3ec9a363"} Mar 19 19:20:50 crc kubenswrapper[4826]: I0319 19:20:50.994148 4826 scope.go:117] "RemoveContainer" containerID="eb9eda7b0b40134844a93ca83c22c387b8b8f574ea7088bae71c9a89f7d2d772" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.043881 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.047696 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.055576 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.062601 4826 scope.go:117] "RemoveContainer" containerID="a2589dabf42c4588b0cc6dc78578f7f4a80fc586d2d738b8fdb0dcb2e6d94270" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074171 4826 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074692 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-notification-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074709 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-notification-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074729 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="extract-utilities" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074738 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="extract-utilities" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074752 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-central-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074758 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-central-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074775 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="proxy-httpd" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074781 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="proxy-httpd" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074791 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074797 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074817 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="extract-content" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074824 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="extract-content" Mar 19 19:20:51 crc kubenswrapper[4826]: E0319 19:20:51.074837 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="sg-core" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.074843 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="sg-core" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.075034 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2662b9-3873-4947-9793-e7e1c6611dcb" containerName="registry-server" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.075057 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="proxy-httpd" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.075065 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-notification-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.075079 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="ceilometer-central-agent" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.075094 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" containerName="sg-core" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.077116 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.079165 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.079379 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.087052 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.127866 4826 scope.go:117] "RemoveContainer" containerID="a4c9c2b33151051406b8c7d44641c2d089b3e04330b8aece4a3db25b071301b1" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.160106 4826 scope.go:117] "RemoveContainer" containerID="2cf3f8b4eb2d75d2931cdb391a0a30b223b7a2fd2875a8ac038f3d303a1604a2" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.210974 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.211051 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.211090 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 
crc kubenswrapper[4826]: I0319 19:20:51.211426 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.211493 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.211568 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.211619 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmkd\" (UniqueName: \"kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.293916 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.293884 4826 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314008 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314071 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314171 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314194 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314227 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" 
Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314260 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmkd\" (UniqueName: \"kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314312 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314892 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.314949 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.319290 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.319806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data\") pod 
\"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.323067 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.324032 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.333936 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmkd\" (UniqueName: \"kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd\") pod \"ceilometer-0\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.431274 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.953207 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.966086 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:20:51 crc kubenswrapper[4826]: I0319 19:20:51.993182 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ddd07d-67cd-45c0-a20f-956fcadee9f2" path="/var/lib/kubelet/pods/35ddd07d-67cd-45c0-a20f-956fcadee9f2/volumes" Mar 19 19:20:53 crc kubenswrapper[4826]: I0319 19:20:53.033498 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerStarted","Data":"283dc0c21fc4d13f34abba2b9306e1cc1ea1732a61cf2f788cb5b3c56edf823e"} Mar 19 19:20:53 crc kubenswrapper[4826]: I0319 19:20:53.034842 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerStarted","Data":"e23afa9ad0f26f6e05e868a10c5767db0dda3c6fe48b07a8270fee74c34ff554"} Mar 19 19:20:53 crc kubenswrapper[4826]: I0319 19:20:53.034891 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerStarted","Data":"d47399e375d7adfccc1cf4b543273baac463502cb7f0d48c0f4fcec2b120f13c"} Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.071284 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerStarted","Data":"12132ab8807a509dbbdbe4be36d39620780d00d23d52b74989ee6106ede014d6"} Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.071406 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" 
containerName="aodh-api" containerID="cri-o://066bb8481e3cc8105d0dc22392d969ef963ef4abb0cfa4d433ffeff5fca3f3ac" gracePeriod=30 Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.071474 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-listener" containerID="cri-o://12132ab8807a509dbbdbe4be36d39620780d00d23d52b74989ee6106ede014d6" gracePeriod=30 Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.071496 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-notifier" containerID="cri-o://283dc0c21fc4d13f34abba2b9306e1cc1ea1732a61cf2f788cb5b3c56edf823e" gracePeriod=30 Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.071531 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-evaluator" containerID="cri-o://0d7b9305c553990c2ae667a61313bf01d6ab1104cbc53cc92394c953df61bc43" gracePeriod=30 Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.074731 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerStarted","Data":"707c1884305aa6b0e887fd03e0b50e66d333df46b33567fdc5189abb11171ad7"} Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.074769 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerStarted","Data":"aafd0be43139554fbc6b2fe29e0107e1d91ebd35e42582a4c48ad96e381f63f7"} Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.090889 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.487392253 podStartE2EDuration="9.090875024s" 
podCreationTimestamp="2026-03-19 19:20:46 +0000 UTC" firstStartedPulling="2026-03-19 19:20:47.316552884 +0000 UTC m=+1472.070621197" lastFinishedPulling="2026-03-19 19:20:53.920035665 +0000 UTC m=+1478.674103968" observedRunningTime="2026-03-19 19:20:55.088535477 +0000 UTC m=+1479.842603790" watchObservedRunningTime="2026-03-19 19:20:55.090875024 +0000 UTC m=+1479.844943337" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.401094 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.401439 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.718338 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.841468 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5spl\" (UniqueName: \"kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl\") pod \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.841700 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle\") pod \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.841773 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data\") pod \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\" (UID: \"38a2c36f-5d7d-4666-b6ed-833eb5e4df87\") " Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.857912 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl" (OuterVolumeSpecName: "kube-api-access-j5spl") pod "38a2c36f-5d7d-4666-b6ed-833eb5e4df87" (UID: "38a2c36f-5d7d-4666-b6ed-833eb5e4df87"). InnerVolumeSpecName "kube-api-access-j5spl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.895802 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data" (OuterVolumeSpecName: "config-data") pod "38a2c36f-5d7d-4666-b6ed-833eb5e4df87" (UID: "38a2c36f-5d7d-4666-b6ed-833eb5e4df87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.902453 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38a2c36f-5d7d-4666-b6ed-833eb5e4df87" (UID: "38a2c36f-5d7d-4666-b6ed-833eb5e4df87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.948201 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5spl\" (UniqueName: \"kubernetes.io/projected/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-kube-api-access-j5spl\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.948230 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:55 crc kubenswrapper[4826]: I0319 19:20:55.948240 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a2c36f-5d7d-4666-b6ed-833eb5e4df87-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.087580 4826 generic.go:334] "Generic (PLEG): container finished" podID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" containerID="7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a" exitCode=137 Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.087634 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.087700 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38a2c36f-5d7d-4666-b6ed-833eb5e4df87","Type":"ContainerDied","Data":"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.087727 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38a2c36f-5d7d-4666-b6ed-833eb5e4df87","Type":"ContainerDied","Data":"c2536f3d9c9e81c3508e4f4dfe2d26929cacba95a23b8703a715e65389f7b2e4"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.087744 4826 scope.go:117] "RemoveContainer" containerID="7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.090970 4826 generic.go:334] "Generic (PLEG): container finished" podID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerID="283dc0c21fc4d13f34abba2b9306e1cc1ea1732a61cf2f788cb5b3c56edf823e" exitCode=0 Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.091002 4826 generic.go:334] "Generic (PLEG): container finished" podID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerID="0d7b9305c553990c2ae667a61313bf01d6ab1104cbc53cc92394c953df61bc43" exitCode=0 Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.091012 4826 generic.go:334] "Generic (PLEG): container finished" podID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerID="066bb8481e3cc8105d0dc22392d969ef963ef4abb0cfa4d433ffeff5fca3f3ac" exitCode=0 Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.091071 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerDied","Data":"283dc0c21fc4d13f34abba2b9306e1cc1ea1732a61cf2f788cb5b3c56edf823e"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.091109 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerDied","Data":"0d7b9305c553990c2ae667a61313bf01d6ab1104cbc53cc92394c953df61bc43"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.091118 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerDied","Data":"066bb8481e3cc8105d0dc22392d969ef963ef4abb0cfa4d433ffeff5fca3f3ac"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.095960 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.096095 4826 generic.go:334] "Generic (PLEG): container finished" podID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerID="12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763" exitCode=137 Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.096128 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerDied","Data":"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.096152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21e0fb8a-2c26-4c1d-973c-3819e82aac1e","Type":"ContainerDied","Data":"203de1919c5b0f25c59f540cfddbdc98ef2a2cff5a27ff43afd1ac66bf65d1a7"} Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.117702 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.118668 4826 scope.go:117] "RemoveContainer" containerID="7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a" Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.127419 4826 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a\": container with ID starting with 7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a not found: ID does not exist" containerID="7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.127459 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a"} err="failed to get container status \"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a\": rpc error: code = NotFound desc = could not find container \"7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a\": container with ID starting with 7145ea36fea7fd8f76ef92adc222fdf5036ecb1b3fa9387569368f9dd533336a not found: ID does not exist" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.127484 4826 scope.go:117] "RemoveContainer" containerID="12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.129770 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.161189 4826 scope.go:117] "RemoveContainer" containerID="6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.169470 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.175798 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.175835 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.175977 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-metadata" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.175988 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-metadata" Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.176086 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-log" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.176116 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-log" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.177278 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-log" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.177313 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" containerName="nova-metadata-metadata" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.177354 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.179446 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.185370 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.185680 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.185820 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.206459 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.222779 4826 scope.go:117] "RemoveContainer" containerID="12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763" Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.223228 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763\": container with ID starting with 12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763 not found: ID does not exist" containerID="12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.223273 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763"} err="failed to get container status \"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763\": rpc error: code = NotFound desc = could not find container \"12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763\": container with ID starting with 12b0237eb50acb20ee1dd8fa6736ea1d41616c908ff1adff5d42be3f58207763 not found: ID does not 
exist" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.223296 4826 scope.go:117] "RemoveContainer" containerID="6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b" Mar 19 19:20:56 crc kubenswrapper[4826]: E0319 19:20:56.223590 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b\": container with ID starting with 6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b not found: ID does not exist" containerID="6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.223621 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b"} err="failed to get container status \"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b\": rpc error: code = NotFound desc = could not find container \"6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b\": container with ID starting with 6830913d3ded7f795234e7a87222830044655e55f8e49cba668eb7e61fa53f0b not found: ID does not exist" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.262395 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data\") pod \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.262730 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle\") pod \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 
19:20:56.262901 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfxp8\" (UniqueName: \"kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8\") pod \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263090 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs\") pod \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\" (UID: \"21e0fb8a-2c26-4c1d-973c-3819e82aac1e\") " Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263335 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263373 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxsk\" (UniqueName: \"kubernetes.io/projected/dc847c8f-db3b-41e1-b0bd-43faeeb17707-kube-api-access-wcxsk\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263517 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263549 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263574 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.263887 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs" (OuterVolumeSpecName: "logs") pod "21e0fb8a-2c26-4c1d-973c-3819e82aac1e" (UID: "21e0fb8a-2c26-4c1d-973c-3819e82aac1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.269886 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8" (OuterVolumeSpecName: "kube-api-access-tfxp8") pod "21e0fb8a-2c26-4c1d-973c-3819e82aac1e" (UID: "21e0fb8a-2c26-4c1d-973c-3819e82aac1e"). InnerVolumeSpecName "kube-api-access-tfxp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.302946 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21e0fb8a-2c26-4c1d-973c-3819e82aac1e" (UID: "21e0fb8a-2c26-4c1d-973c-3819e82aac1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.310820 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data" (OuterVolumeSpecName: "config-data") pod "21e0fb8a-2c26-4c1d-973c-3819e82aac1e" (UID: "21e0fb8a-2c26-4c1d-973c-3819e82aac1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365218 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365316 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365355 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365427 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365464 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxsk\" (UniqueName: \"kubernetes.io/projected/dc847c8f-db3b-41e1-b0bd-43faeeb17707-kube-api-access-wcxsk\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365752 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365780 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365792 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.365803 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfxp8\" (UniqueName: \"kubernetes.io/projected/21e0fb8a-2c26-4c1d-973c-3819e82aac1e-kube-api-access-tfxp8\") on node \"crc\" DevicePath \"\"" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.370564 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.372346 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.374111 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.374944 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc847c8f-db3b-41e1-b0bd-43faeeb17707-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.383708 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxsk\" (UniqueName: \"kubernetes.io/projected/dc847c8f-db3b-41e1-b0bd-43faeeb17707-kube-api-access-wcxsk\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc847c8f-db3b-41e1-b0bd-43faeeb17707\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:56 crc kubenswrapper[4826]: I0319 19:20:56.541033 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.133717 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.150679 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.359190 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.379060 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.389336 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.391414 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.395105 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.395337 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.406968 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.505322 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrxf\" (UniqueName: \"kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.505508 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs\") pod 
\"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.505709 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.506352 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.506491 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.608562 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.608938 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 
crc kubenswrapper[4826]: I0319 19:20:57.609033 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.609126 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrxf\" (UniqueName: \"kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.609226 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.609627 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.619623 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.619861 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.624452 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.632525 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrxf\" (UniqueName: \"kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf\") pod \"nova-metadata-0\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " pod="openstack/nova-metadata-0" Mar 19 19:20:57 crc kubenswrapper[4826]: I0319 19:20:57.714265 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.001910 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e0fb8a-2c26-4c1d-973c-3819e82aac1e" path="/var/lib/kubelet/pods/21e0fb8a-2c26-4c1d-973c-3819e82aac1e/volumes" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.002927 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a2c36f-5d7d-4666-b6ed-833eb5e4df87" path="/var/lib/kubelet/pods/38a2c36f-5d7d-4666-b6ed-833eb5e4df87/volumes" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.186612 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerStarted","Data":"4cfabe6bab5fd022951d51e21a1085e914f967280443bb534282dee5c787c327"} Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.187026 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-central-agent" containerID="cri-o://e23afa9ad0f26f6e05e868a10c5767db0dda3c6fe48b07a8270fee74c34ff554" gracePeriod=30 Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.187295 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.187607 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="proxy-httpd" containerID="cri-o://4cfabe6bab5fd022951d51e21a1085e914f967280443bb534282dee5c787c327" gracePeriod=30 Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.187674 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="sg-core" 
containerID="cri-o://707c1884305aa6b0e887fd03e0b50e66d333df46b33567fdc5189abb11171ad7" gracePeriod=30 Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.187744 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-notification-agent" containerID="cri-o://aafd0be43139554fbc6b2fe29e0107e1d91ebd35e42582a4c48ad96e381f63f7" gracePeriod=30 Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.197612 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc847c8f-db3b-41e1-b0bd-43faeeb17707","Type":"ContainerStarted","Data":"0730ba21877625dbac3d57b94f00e8c8b623cd3dfd8d619ee66447f9de0ad60c"} Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.197671 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dc847c8f-db3b-41e1-b0bd-43faeeb17707","Type":"ContainerStarted","Data":"aa47eb7f290c2b2d5f459f4d851151b4c3ee273e79dbdd44c59006fc54fc0c6b"} Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.211074 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.211769 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.223972 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8367861640000003 podStartE2EDuration="7.223953337s" podCreationTimestamp="2026-03-19 19:20:51 +0000 UTC" firstStartedPulling="2026-03-19 19:20:52.304890429 +0000 UTC m=+1477.058958742" lastFinishedPulling="2026-03-19 19:20:56.692057602 +0000 UTC m=+1481.446125915" observedRunningTime="2026-03-19 19:20:58.209686022 +0000 UTC m=+1482.963754335" watchObservedRunningTime="2026-03-19 19:20:58.223953337 +0000 UTC 
m=+1482.978021650" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.235457 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.235440485 podStartE2EDuration="2.235440485s" podCreationTimestamp="2026-03-19 19:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:58.234645866 +0000 UTC m=+1482.988714169" watchObservedRunningTime="2026-03-19 19:20:58.235440485 +0000 UTC m=+1482.989508798" Mar 19 19:20:58 crc kubenswrapper[4826]: I0319 19:20:58.279345 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.210816 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerStarted","Data":"57a770e2babd457e5d0e0509b6e06c3cd5477b4dcf558f2b3169a5db02195e38"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.211259 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerStarted","Data":"542933139d76ea84d17315b736ffc8e0232eda1953b1a59a7b95f6273643fa4f"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.211283 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerStarted","Data":"13bbedafdb1a9c32f973db4f1bc22003b5e7fdd2cf4505922005ec8af8d0c9da"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218064 4826 generic.go:334] "Generic (PLEG): container finished" podID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerID="4cfabe6bab5fd022951d51e21a1085e914f967280443bb534282dee5c787c327" exitCode=0 Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218094 4826 generic.go:334] "Generic (PLEG): 
container finished" podID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerID="707c1884305aa6b0e887fd03e0b50e66d333df46b33567fdc5189abb11171ad7" exitCode=2 Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218117 4826 generic.go:334] "Generic (PLEG): container finished" podID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerID="aafd0be43139554fbc6b2fe29e0107e1d91ebd35e42582a4c48ad96e381f63f7" exitCode=0 Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218373 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerDied","Data":"4cfabe6bab5fd022951d51e21a1085e914f967280443bb534282dee5c787c327"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218437 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerDied","Data":"707c1884305aa6b0e887fd03e0b50e66d333df46b33567fdc5189abb11171ad7"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.218455 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerDied","Data":"aafd0be43139554fbc6b2fe29e0107e1d91ebd35e42582a4c48ad96e381f63f7"} Mar 19 19:20:59 crc kubenswrapper[4826]: I0319 19:20:59.235733 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.235713981 podStartE2EDuration="2.235713981s" podCreationTimestamp="2026-03-19 19:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:20:59.227906132 +0000 UTC m=+1483.981974445" watchObservedRunningTime="2026-03-19 19:20:59.235713981 +0000 UTC m=+1483.989782294" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.215094 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.222335 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.235682 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.236021 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.467845 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"] Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.470672 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.484309 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"] Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.596894 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.596938 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.596958 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.596998 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.597080 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.597175 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699020 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699102 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699127 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699141 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699177 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.699264 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.700186 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc\") pod 
\"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.700958 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.701554 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.701575 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.702078 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.721785 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx\") pod \"dnsmasq-dns-f84f9ccf-p6gql\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") " 
pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:00 crc kubenswrapper[4826]: I0319 19:21:00.797896 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:01 crc kubenswrapper[4826]: I0319 19:21:01.436493 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"] Mar 19 19:21:01 crc kubenswrapper[4826]: I0319 19:21:01.546755 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:21:02 crc kubenswrapper[4826]: I0319 19:21:02.269844 4826 generic.go:334] "Generic (PLEG): container finished" podID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerID="8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3" exitCode=0 Mar 19 19:21:02 crc kubenswrapper[4826]: I0319 19:21:02.271605 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" event={"ID":"198aca80-7461-4f41-b6ff-9fd31d3d28e2","Type":"ContainerDied","Data":"8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3"} Mar 19 19:21:02 crc kubenswrapper[4826]: I0319 19:21:02.271637 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" event={"ID":"198aca80-7461-4f41-b6ff-9fd31d3d28e2","Type":"ContainerStarted","Data":"c0d5fe54ce8ab26b51dddc5552e4c9cae64c7fd049abcc0692a4539d07ad818a"} Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.031264 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.348261 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" event={"ID":"198aca80-7461-4f41-b6ff-9fd31d3d28e2","Type":"ContainerStarted","Data":"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"} Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.348991 4826 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.362410 4826 generic.go:334] "Generic (PLEG): container finished" podID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerID="e23afa9ad0f26f6e05e868a10c5767db0dda3c6fe48b07a8270fee74c34ff554" exitCode=0 Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.362609 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-log" containerID="cri-o://b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5" gracePeriod=30 Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.362901 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerDied","Data":"e23afa9ad0f26f6e05e868a10c5767db0dda3c6fe48b07a8270fee74c34ff554"} Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.362979 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-api" containerID="cri-o://d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8" gracePeriod=30 Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.389497 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" podStartSLOduration=3.389472075 podStartE2EDuration="3.389472075s" podCreationTimestamp="2026-03-19 19:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:21:03.371558222 +0000 UTC m=+1488.125626535" watchObservedRunningTime="2026-03-19 19:21:03.389472075 +0000 UTC m=+1488.143540388" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.662586 4826 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.710436 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmkd\" (UniqueName: \"kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.711413 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.712217 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.712372 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.712402 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.712465 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.712505 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd\") pod \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\" (UID: \"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3\") " Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.713471 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.717024 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd" (OuterVolumeSpecName: "kube-api-access-7gmkd") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "kube-api-access-7gmkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.717416 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts" (OuterVolumeSpecName: "scripts") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.717709 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.719256 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.719283 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.719296 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.719305 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmkd\" (UniqueName: \"kubernetes.io/projected/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-kube-api-access-7gmkd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.765017 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.822852 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.845907 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.873789 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data" (OuterVolumeSpecName: "config-data") pod "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" (UID: "a25b4b35-8c3f-41b6-b5e4-170125ab3ab3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.925399 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:03 crc kubenswrapper[4826]: I0319 19:21:03.925428 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.375766 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a25b4b35-8c3f-41b6-b5e4-170125ab3ab3","Type":"ContainerDied","Data":"d47399e375d7adfccc1cf4b543273baac463502cb7f0d48c0f4fcec2b120f13c"} Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.375809 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.376146 4826 scope.go:117] "RemoveContainer" containerID="4cfabe6bab5fd022951d51e21a1085e914f967280443bb534282dee5c787c327" Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.378632 4826 generic.go:334] "Generic (PLEG): container finished" podID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerID="b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5" exitCode=143 Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.378879 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerDied","Data":"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5"} Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.398266 4826 scope.go:117] "RemoveContainer" containerID="707c1884305aa6b0e887fd03e0b50e66d333df46b33567fdc5189abb11171ad7" Mar 19 19:21:04 crc 
kubenswrapper[4826]: I0319 19:21:04.417959 4826 scope.go:117] "RemoveContainer" containerID="aafd0be43139554fbc6b2fe29e0107e1d91ebd35e42582a4c48ad96e381f63f7" Mar 19 19:21:04 crc kubenswrapper[4826]: I0319 19:21:04.437573 4826 scope.go:117] "RemoveContainer" containerID="e23afa9ad0f26f6e05e868a10c5767db0dda3c6fe48b07a8270fee74c34ff554" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.174172 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.195623 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.258266 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:05 crc kubenswrapper[4826]: E0319 19:21:05.260452 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="sg-core" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.264054 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="sg-core" Mar 19 19:21:05 crc kubenswrapper[4826]: E0319 19:21:05.264206 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="proxy-httpd" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.264399 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="proxy-httpd" Mar 19 19:21:05 crc kubenswrapper[4826]: E0319 19:21:05.264568 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-central-agent" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.264627 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-central-agent" Mar 19 19:21:05 crc 
kubenswrapper[4826]: E0319 19:21:05.264730 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-notification-agent" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.264784 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-notification-agent" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.272885 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-central-agent" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.276769 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="ceilometer-notification-agent" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.276941 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="sg-core" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.277007 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" containerName="proxy-httpd" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.294912 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.298091 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.302426 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.302668 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.464962 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb55d\" (UniqueName: \"kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.465794 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.465968 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.466120 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.466316 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.466413 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.466699 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569381 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569454 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569508 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569587 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569619 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569762 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.569866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb55d\" (UniqueName: \"kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.570516 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " 
pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.570525 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.574777 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.576143 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.576917 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.577194 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.597091 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb55d\" (UniqueName: 
\"kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d\") pod \"ceilometer-0\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " pod="openstack/ceilometer-0" Mar 19 19:21:05 crc kubenswrapper[4826]: I0319 19:21:05.630003 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 19:21:06.001248 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25b4b35-8c3f-41b6-b5e4-170125ab3ab3" path="/var/lib/kubelet/pods/a25b4b35-8c3f-41b6-b5e4-170125ab3ab3/volumes" Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 19:21:06.145713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:06 crc kubenswrapper[4826]: W0319 19:21:06.147858 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c5c794_322f_4411_8c97_ffe26c839870.slice/crio-82dc1f47452b0089d25c8aca39aac219705b2f0f602681d93461878cb00eb05a WatchSource:0}: Error finding container 82dc1f47452b0089d25c8aca39aac219705b2f0f602681d93461878cb00eb05a: Status 404 returned error can't find the container with id 82dc1f47452b0089d25c8aca39aac219705b2f0f602681d93461878cb00eb05a Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 19:21:06.405819 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerStarted","Data":"82dc1f47452b0089d25c8aca39aac219705b2f0f602681d93461878cb00eb05a"} Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 19:21:06.542715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 19:21:06.563171 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:21:06 crc kubenswrapper[4826]: I0319 
19:21:06.859956 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.134503 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.207063 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle\") pod \"a6bded02-bd3b-455f-9ca2-e075e45efce2\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.207151 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqxl6\" (UniqueName: \"kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6\") pod \"a6bded02-bd3b-455f-9ca2-e075e45efce2\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.207330 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data\") pod \"a6bded02-bd3b-455f-9ca2-e075e45efce2\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.207366 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs\") pod \"a6bded02-bd3b-455f-9ca2-e075e45efce2\" (UID: \"a6bded02-bd3b-455f-9ca2-e075e45efce2\") " Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.208515 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs" (OuterVolumeSpecName: "logs") pod "a6bded02-bd3b-455f-9ca2-e075e45efce2" (UID: 
"a6bded02-bd3b-455f-9ca2-e075e45efce2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.212689 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6" (OuterVolumeSpecName: "kube-api-access-vqxl6") pod "a6bded02-bd3b-455f-9ca2-e075e45efce2" (UID: "a6bded02-bd3b-455f-9ca2-e075e45efce2"). InnerVolumeSpecName "kube-api-access-vqxl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.256485 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data" (OuterVolumeSpecName: "config-data") pod "a6bded02-bd3b-455f-9ca2-e075e45efce2" (UID: "a6bded02-bd3b-455f-9ca2-e075e45efce2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.261744 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6bded02-bd3b-455f-9ca2-e075e45efce2" (UID: "a6bded02-bd3b-455f-9ca2-e075e45efce2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.310443 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqxl6\" (UniqueName: \"kubernetes.io/projected/a6bded02-bd3b-455f-9ca2-e075e45efce2-kube-api-access-vqxl6\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.310767 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.310778 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bded02-bd3b-455f-9ca2-e075e45efce2-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.310786 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bded02-bd3b-455f-9ca2-e075e45efce2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.424213 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerStarted","Data":"86706cd24f93f371f82041738d01d85cbab5881bdaada3c05f6b7a4c859a34a2"} Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.428187 4826 generic.go:334] "Generic (PLEG): container finished" podID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerID="d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8" exitCode=0 Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.428281 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerDied","Data":"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8"} Mar 19 19:21:07 crc 
kubenswrapper[4826]: I0319 19:21:07.428318 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a6bded02-bd3b-455f-9ca2-e075e45efce2","Type":"ContainerDied","Data":"53c853ab47911738e92f7646bfa6c9ac86adf0f6c8bf6986a0d0c7f508a945d6"} Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.428348 4826 scope.go:117] "RemoveContainer" containerID="d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.428498 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.450090 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.476734 4826 scope.go:117] "RemoveContainer" containerID="b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.494790 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.506624 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.568866 4826 scope.go:117] "RemoveContainer" containerID="d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.574881 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:07 crc kubenswrapper[4826]: E0319 19:21:07.575435 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-api" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.575503 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-api" Mar 19 
19:21:07 crc kubenswrapper[4826]: E0319 19:21:07.575558 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-log" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.575614 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-log" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.575922 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-log" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.575999 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" containerName="nova-api-api" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.577189 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: E0319 19:21:07.605478 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8\": container with ID starting with d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8 not found: ID does not exist" containerID="d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.605524 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8"} err="failed to get container status \"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8\": rpc error: code = NotFound desc = could not find container \"d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8\": container with ID starting with d45aa60393a89e6eb5f2b8d8e8f3fc97b97f1c8ea445bc961c4d74b049bd2ad8 not found: ID does 
not exist" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.605549 4826 scope.go:117] "RemoveContainer" containerID="b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.616912 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.617287 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.617468 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619060 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619106 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619200 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619230 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619251 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7bd\" (UniqueName: \"kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.619312 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: E0319 19:21:07.635196 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5\": container with ID starting with b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5 not found: ID does not exist" containerID="b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.635441 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5"} err="failed to get container status \"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5\": rpc error: code = NotFound desc = could not find container \"b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5\": container with ID starting with b662033fd78e3e49d124f351a53cb51de19019121010bc527f15b070a0b6cbd5 not found: ID does not exist" 
Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.702759 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.716425 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.717414 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738335 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738394 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738600 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738640 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738676 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wc7bd\" (UniqueName: \"kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.738786 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.745188 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.792789 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7bd\" (UniqueName: \"kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.795294 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.795449 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " 
pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.837423 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.850149 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs\") pod \"nova-api-0\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.899540 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4bbb2"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.901363 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.904688 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.911443 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.940159 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.940483 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4bbb2"] Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.943680 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.943774 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.943907 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:07 crc kubenswrapper[4826]: I0319 19:21:07.943949 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.001538 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a6bded02-bd3b-455f-9ca2-e075e45efce2" path="/var/lib/kubelet/pods/a6bded02-bd3b-455f-9ca2-e075e45efce2/volumes" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.045118 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.045286 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.048118 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.048274 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.049237 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " 
pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.050107 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.054194 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.066168 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s\") pod \"nova-cell1-cell-mapping-4bbb2\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.255184 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.441087 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerStarted","Data":"120b5a18964dd3803d5a3787a7b90b476414b8f449d8faddbe45db1a376809c7"} Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.547028 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:08 crc kubenswrapper[4826]: W0319 19:21:08.562302 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9edd25fe_7b6f_45d9_835e_6294924dd2e8.slice/crio-2365cccb9107e41c2e7bd5f369b24edabb660690f976ffa7f1bfe146d8c725d1 WatchSource:0}: Error finding container 2365cccb9107e41c2e7bd5f369b24edabb660690f976ffa7f1bfe146d8c725d1: Status 404 returned error can't find the container with id 2365cccb9107e41c2e7bd5f369b24edabb660690f976ffa7f1bfe146d8c725d1 Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.751930 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.751935 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:08 crc kubenswrapper[4826]: I0319 19:21:08.811393 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4bbb2"] Mar 19 19:21:09 crc 
kubenswrapper[4826]: I0319 19:21:09.453251 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerStarted","Data":"bf0b0b439d8cc34678151ba96a557c5b6e9d9751f1e7893172e6ddfa22680fbc"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.456688 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerStarted","Data":"9680c6258104006dbd167cbb1c88f520e4eeeffb438bf476d2f46d00460123e8"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.456754 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerStarted","Data":"b3acaa4f1ddd5de697825bd8b7ea0be75e8d7eaa7aebacc95ca99f2e3b632eac"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.456768 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerStarted","Data":"2365cccb9107e41c2e7bd5f369b24edabb660690f976ffa7f1bfe146d8c725d1"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.458730 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4bbb2" event={"ID":"a8c464b5-25b1-4543-8067-01ca9883d215","Type":"ContainerStarted","Data":"09baaea1ee7746a33fabaa2cd9a14a050dad1d6223ce71cfd63f0bb9d1e46a1b"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.458771 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4bbb2" event={"ID":"a8c464b5-25b1-4543-8067-01ca9883d215","Type":"ContainerStarted","Data":"47521735cd6e7c19a90933dceaf811d13aefeb17764f2e0a993f418d0f3a27a2"} Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.483314 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.483297808 
podStartE2EDuration="2.483297808s" podCreationTimestamp="2026-03-19 19:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:21:09.472043556 +0000 UTC m=+1494.226111869" watchObservedRunningTime="2026-03-19 19:21:09.483297808 +0000 UTC m=+1494.237366121" Mar 19 19:21:09 crc kubenswrapper[4826]: I0319 19:21:09.498965 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4bbb2" podStartSLOduration=2.498945006 podStartE2EDuration="2.498945006s" podCreationTimestamp="2026-03-19 19:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:21:09.488525315 +0000 UTC m=+1494.242593628" watchObservedRunningTime="2026-03-19 19:21:09.498945006 +0000 UTC m=+1494.253013319" Mar 19 19:21:10 crc kubenswrapper[4826]: I0319 19:21:10.798883 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" Mar 19 19:21:10 crc kubenswrapper[4826]: I0319 19:21:10.914905 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:21:10 crc kubenswrapper[4826]: I0319 19:21:10.915319 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="dnsmasq-dns" containerID="cri-o://08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285" gracePeriod=10 Mar 19 19:21:10 crc kubenswrapper[4826]: I0319 19:21:10.970345 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:21:10 crc kubenswrapper[4826]: I0319 19:21:10.976501 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.011009 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.149224 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stgfq\" (UniqueName: \"kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.149269 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.149365 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.251548 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.251823 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-stgfq\" (UniqueName: \"kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.251852 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.252318 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.252578 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.275412 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stgfq\" (UniqueName: \"kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq\") pod \"certified-operators-rqc8n\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.396531 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.480463 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.485878 4826 generic.go:334] "Generic (PLEG): container finished" podID="9d685221-0f29-401d-af30-cbf46d71761f" containerID="08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285" exitCode=0 Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.486070 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.486482 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" event={"ID":"9d685221-0f29-401d-af30-cbf46d71761f","Type":"ContainerDied","Data":"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285"} Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.486518 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-2l77b" event={"ID":"9d685221-0f29-401d-af30-cbf46d71761f","Type":"ContainerDied","Data":"b6aab2661df461134f6042d890f6e93d6ef0741cdf9dbb0248c37bf9029966fc"} Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.486537 4826 scope.go:117] "RemoveContainer" containerID="08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.578561 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerStarted","Data":"fdeaf3bba11c51e0eef3996637ea9414068776a93b7a8da7359bdc2f09acc8b9"} Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.578764 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-central-agent" containerID="cri-o://86706cd24f93f371f82041738d01d85cbab5881bdaada3c05f6b7a4c859a34a2" gracePeriod=30 Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.579043 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.579400 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="proxy-httpd" containerID="cri-o://fdeaf3bba11c51e0eef3996637ea9414068776a93b7a8da7359bdc2f09acc8b9" gracePeriod=30 Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.579446 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="sg-core" containerID="cri-o://bf0b0b439d8cc34678151ba96a557c5b6e9d9751f1e7893172e6ddfa22680fbc" gracePeriod=30 Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.579479 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-notification-agent" containerID="cri-o://120b5a18964dd3803d5a3787a7b90b476414b8f449d8faddbe45db1a376809c7" gracePeriod=30 Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.610935 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.119091478 podStartE2EDuration="6.610918472s" podCreationTimestamp="2026-03-19 19:21:05 +0000 UTC" firstStartedPulling="2026-03-19 19:21:06.154791921 +0000 UTC m=+1490.908860234" lastFinishedPulling="2026-03-19 19:21:10.646618915 +0000 UTC m=+1495.400687228" observedRunningTime="2026-03-19 19:21:11.602718223 +0000 UTC m=+1496.356786536" watchObservedRunningTime="2026-03-19 19:21:11.610918472 +0000 UTC 
m=+1496.364986785" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.641208 4826 scope.go:117] "RemoveContainer" containerID="ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.659819 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.660056 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.660096 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.660126 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.660159 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnnf6\" (UniqueName: \"kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc 
kubenswrapper[4826]: I0319 19:21:11.660338 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb\") pod \"9d685221-0f29-401d-af30-cbf46d71761f\" (UID: \"9d685221-0f29-401d-af30-cbf46d71761f\") " Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.687094 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6" (OuterVolumeSpecName: "kube-api-access-gnnf6") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). InnerVolumeSpecName "kube-api-access-gnnf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.769350 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnnf6\" (UniqueName: \"kubernetes.io/projected/9d685221-0f29-401d-af30-cbf46d71761f-kube-api-access-gnnf6\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.810056 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.871804 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.872485 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.872741 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.877583 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config" (OuterVolumeSpecName: "config") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.889334 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d685221-0f29-401d-af30-cbf46d71761f" (UID: "9d685221-0f29-401d-af30-cbf46d71761f"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.974205 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.974407 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.974502 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.974573 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d685221-0f29-401d-af30-cbf46d71761f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.998014 4826 scope.go:117] "RemoveContainer" containerID="08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285" Mar 19 19:21:11 crc kubenswrapper[4826]: E0319 19:21:11.998513 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285\": container with ID starting with 08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285 not found: ID does not exist" containerID="08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.998551 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285"} err="failed to get container status \"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285\": rpc error: code = NotFound desc = could not find container \"08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285\": container with ID starting with 08ca47e73a555446029fe4510be127fe2cb0a3214a86ca16eef9fe536654a285 not found: ID does not exist" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.998577 4826 scope.go:117] "RemoveContainer" containerID="ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad" Mar 19 19:21:11 crc kubenswrapper[4826]: E0319 19:21:11.999667 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad\": container with ID starting with ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad not found: ID does not exist" containerID="ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad" Mar 19 19:21:11 crc kubenswrapper[4826]: I0319 19:21:11.999703 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad"} err="failed to get container status \"ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad\": rpc error: code = NotFound desc = could not find container \"ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad\": container with ID starting with ffea9df019c22d2622d4e303b878781de8efe11751dca9df73e3faca51be38ad not found: ID does not exist" Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.037027 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.125768 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.137142 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-2l77b"] Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.607947 4826 generic.go:334] "Generic (PLEG): container finished" podID="d2c5c794-322f-4411-8c97-ffe26c839870" containerID="fdeaf3bba11c51e0eef3996637ea9414068776a93b7a8da7359bdc2f09acc8b9" exitCode=0 Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.608341 4826 generic.go:334] "Generic (PLEG): container finished" podID="d2c5c794-322f-4411-8c97-ffe26c839870" containerID="bf0b0b439d8cc34678151ba96a557c5b6e9d9751f1e7893172e6ddfa22680fbc" exitCode=2 Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.608357 4826 generic.go:334] "Generic (PLEG): container finished" podID="d2c5c794-322f-4411-8c97-ffe26c839870" containerID="120b5a18964dd3803d5a3787a7b90b476414b8f449d8faddbe45db1a376809c7" exitCode=0 Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.608048 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerDied","Data":"fdeaf3bba11c51e0eef3996637ea9414068776a93b7a8da7359bdc2f09acc8b9"} Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.608454 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerDied","Data":"bf0b0b439d8cc34678151ba96a557c5b6e9d9751f1e7893172e6ddfa22680fbc"} Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.608475 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerDied","Data":"120b5a18964dd3803d5a3787a7b90b476414b8f449d8faddbe45db1a376809c7"} Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.610641 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerID="8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a" exitCode=0 Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.610744 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerDied","Data":"8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a"} Mar 19 19:21:12 crc kubenswrapper[4826]: I0319 19:21:12.610779 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerStarted","Data":"2281d633ee6455bc6913e54e27e59bf25800e6184cc831bce307e1ea5be9cab9"} Mar 19 19:21:13 crc kubenswrapper[4826]: I0319 19:21:13.994348 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d685221-0f29-401d-af30-cbf46d71761f" path="/var/lib/kubelet/pods/9d685221-0f29-401d-af30-cbf46d71761f/volumes" Mar 19 19:21:14 crc kubenswrapper[4826]: I0319 19:21:14.640389 4826 generic.go:334] "Generic (PLEG): container finished" podID="d2c5c794-322f-4411-8c97-ffe26c839870" containerID="86706cd24f93f371f82041738d01d85cbab5881bdaada3c05f6b7a4c859a34a2" exitCode=0 Mar 19 19:21:14 crc kubenswrapper[4826]: I0319 19:21:14.640766 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerDied","Data":"86706cd24f93f371f82041738d01d85cbab5881bdaada3c05f6b7a4c859a34a2"} Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.229992 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372402 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372680 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb55d\" (UniqueName: \"kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372724 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372781 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372826 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372854 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.372876 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data\") pod \"d2c5c794-322f-4411-8c97-ffe26c839870\" (UID: \"d2c5c794-322f-4411-8c97-ffe26c839870\") " Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.373274 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.373302 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.374028 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.374065 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2c5c794-322f-4411-8c97-ffe26c839870-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.377093 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts" (OuterVolumeSpecName: "scripts") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.377366 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d" (OuterVolumeSpecName: "kube-api-access-fb55d") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "kube-api-access-fb55d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.406559 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.475467 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.476071 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.476111 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb55d\" (UniqueName: \"kubernetes.io/projected/d2c5c794-322f-4411-8c97-ffe26c839870-kube-api-access-fb55d\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.476128 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.476143 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.526473 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data" (OuterVolumeSpecName: "config-data") pod "d2c5c794-322f-4411-8c97-ffe26c839870" (UID: "d2c5c794-322f-4411-8c97-ffe26c839870"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.578487 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c5c794-322f-4411-8c97-ffe26c839870-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.658816 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2c5c794-322f-4411-8c97-ffe26c839870","Type":"ContainerDied","Data":"82dc1f47452b0089d25c8aca39aac219705b2f0f602681d93461878cb00eb05a"} Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.658838 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.658871 4826 scope.go:117] "RemoveContainer" containerID="fdeaf3bba11c51e0eef3996637ea9414068776a93b7a8da7359bdc2f09acc8b9" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.661127 4826 generic.go:334] "Generic (PLEG): container finished" podID="a8c464b5-25b1-4543-8067-01ca9883d215" containerID="09baaea1ee7746a33fabaa2cd9a14a050dad1d6223ce71cfd63f0bb9d1e46a1b" exitCode=0 Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.661155 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4bbb2" event={"ID":"a8c464b5-25b1-4543-8067-01ca9883d215","Type":"ContainerDied","Data":"09baaea1ee7746a33fabaa2cd9a14a050dad1d6223ce71cfd63f0bb9d1e46a1b"} Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.706561 4826 scope.go:117] "RemoveContainer" containerID="bf0b0b439d8cc34678151ba96a557c5b6e9d9751f1e7893172e6ddfa22680fbc" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.713105 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.714759 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.714788 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.734211 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.749518 4826 scope.go:117] "RemoveContainer" containerID="120b5a18964dd3803d5a3787a7b90b476414b8f449d8faddbe45db1a376809c7" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.750729 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751311 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="dnsmasq-dns" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751336 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="dnsmasq-dns" Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751363 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-notification-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751372 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-notification-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751393 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="init" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751400 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="init" Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751414 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="proxy-httpd" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751421 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="proxy-httpd" Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751448 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="sg-core" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751456 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="sg-core" Mar 19 19:21:15 crc kubenswrapper[4826]: E0319 19:21:15.751479 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-central-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751487 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-central-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751923 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d685221-0f29-401d-af30-cbf46d71761f" containerName="dnsmasq-dns" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751960 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-notification-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.751991 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="sg-core" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.752007 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="proxy-httpd" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.752035 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d2c5c794-322f-4411-8c97-ffe26c839870" containerName="ceilometer-central-agent" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.756322 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.759403 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.759868 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.761357 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.809586 4826 scope.go:117] "RemoveContainer" containerID="86706cd24f93f371f82041738d01d85cbab5881bdaada3c05f6b7a4c859a34a2" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886048 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886109 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886182 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " 
pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886229 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxx8\" (UniqueName: \"kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886245 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886526 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.886649 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989196 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989324 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989376 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxx8\" (UniqueName: \"kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989396 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989448 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989476 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.989619 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " 
pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.990223 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.991478 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.992171 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.993542 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.994944 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c5c794-322f-4411-8c97-ffe26c839870" path="/var/lib/kubelet/pods/d2c5c794-322f-4411-8c97-ffe26c839870/volumes" Mar 19 19:21:15 crc kubenswrapper[4826]: I0319 19:21:15.998255 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:16 crc kubenswrapper[4826]: I0319 19:21:16.003712 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:16 
crc kubenswrapper[4826]: I0319 19:21:16.003787 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:16 crc kubenswrapper[4826]: I0319 19:21:16.004174 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:16 crc kubenswrapper[4826]: I0319 19:21:16.015343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxx8\" (UniqueName: \"kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8\") pod \"ceilometer-0\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " pod="openstack/ceilometer-0" Mar 19 19:21:16 crc kubenswrapper[4826]: I0319 19:21:16.079181 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:16 crc kubenswrapper[4826]: I0319 19:21:16.572287 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:17 crc kubenswrapper[4826]: I0319 19:21:17.721865 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:21:17 crc kubenswrapper[4826]: I0319 19:21:17.730694 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:21:17 crc kubenswrapper[4826]: I0319 19:21:17.736560 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:21:17 crc kubenswrapper[4826]: I0319 19:21:17.941177 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:21:17 crc kubenswrapper[4826]: I0319 19:21:17.941220 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:21:18 crc kubenswrapper[4826]: I0319 19:21:18.704911 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:21:18 crc kubenswrapper[4826]: I0319 19:21:18.959843 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.9:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:18 crc kubenswrapper[4826]: I0319 19:21:18.959844 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.9:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.717788 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerStarted","Data":"cb5a54c730ec4dc361ae794ceca976eea05af5e6ceaa3e176c9f15b322e470da"} Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.721962 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4bbb2" event={"ID":"a8c464b5-25b1-4543-8067-01ca9883d215","Type":"ContainerDied","Data":"47521735cd6e7c19a90933dceaf811d13aefeb17764f2e0a993f418d0f3a27a2"} Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.722009 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47521735cd6e7c19a90933dceaf811d13aefeb17764f2e0a993f418d0f3a27a2" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.816562 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.934867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s\") pod \"a8c464b5-25b1-4543-8067-01ca9883d215\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.934913 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts\") pod \"a8c464b5-25b1-4543-8067-01ca9883d215\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.935001 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data\") pod \"a8c464b5-25b1-4543-8067-01ca9883d215\" (UID: 
\"a8c464b5-25b1-4543-8067-01ca9883d215\") " Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.935080 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle\") pod \"a8c464b5-25b1-4543-8067-01ca9883d215\" (UID: \"a8c464b5-25b1-4543-8067-01ca9883d215\") " Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.941147 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s" (OuterVolumeSpecName: "kube-api-access-4ll8s") pod "a8c464b5-25b1-4543-8067-01ca9883d215" (UID: "a8c464b5-25b1-4543-8067-01ca9883d215"). InnerVolumeSpecName "kube-api-access-4ll8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.943581 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts" (OuterVolumeSpecName: "scripts") pod "a8c464b5-25b1-4543-8067-01ca9883d215" (UID: "a8c464b5-25b1-4543-8067-01ca9883d215"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.982293 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8c464b5-25b1-4543-8067-01ca9883d215" (UID: "a8c464b5-25b1-4543-8067-01ca9883d215"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:19 crc kubenswrapper[4826]: I0319 19:21:19.991829 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data" (OuterVolumeSpecName: "config-data") pod "a8c464b5-25b1-4543-8067-01ca9883d215" (UID: "a8c464b5-25b1-4543-8067-01ca9883d215"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.037842 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.037891 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/a8c464b5-25b1-4543-8067-01ca9883d215-kube-api-access-4ll8s\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.037903 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.037913 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8c464b5-25b1-4543-8067-01ca9883d215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.735008 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerStarted","Data":"ed7e294926a0ef197174066c924ebcfb807fcd4eaf2ebe457810591beb97a0c9"} Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.738133 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerID="2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14" exitCode=0 Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.738246 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerDied","Data":"2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14"} Mar 19 19:21:20 crc kubenswrapper[4826]: I0319 19:21:20.738321 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4bbb2" Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.035675 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.035902 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-log" containerID="cri-o://b3acaa4f1ddd5de697825bd8b7ea0be75e8d7eaa7aebacc95ca99f2e3b632eac" gracePeriod=30 Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.036322 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-api" containerID="cri-o://9680c6258104006dbd167cbb1c88f520e4eeeffb438bf476d2f46d00460123e8" gracePeriod=30 Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.044736 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.045000 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerName="nova-scheduler-scheduler" containerID="cri-o://a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" gracePeriod=30 Mar 19 
19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.117461 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.751062 4826 generic.go:334] "Generic (PLEG): container finished" podID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerID="b3acaa4f1ddd5de697825bd8b7ea0be75e8d7eaa7aebacc95ca99f2e3b632eac" exitCode=143 Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.751346 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerDied","Data":"b3acaa4f1ddd5de697825bd8b7ea0be75e8d7eaa7aebacc95ca99f2e3b632eac"} Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.753802 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerStarted","Data":"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8"} Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.771210 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-log" containerID="cri-o://542933139d76ea84d17315b736ffc8e0232eda1953b1a59a7b95f6273643fa4f" gracePeriod=30 Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.771525 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerStarted","Data":"29b977b63eb31265f7dfb8fbdc21a0f9c9dec3c445d20fa03128babcecd48db0"} Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.771612 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-metadata" 
containerID="cri-o://57a770e2babd457e5d0e0509b6e06c3cd5477b4dcf558f2b3169a5db02195e38" gracePeriod=30 Mar 19 19:21:21 crc kubenswrapper[4826]: I0319 19:21:21.812220 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqc8n" podStartSLOduration=3.259385744 podStartE2EDuration="11.812203879s" podCreationTimestamp="2026-03-19 19:21:10 +0000 UTC" firstStartedPulling="2026-03-19 19:21:12.612585011 +0000 UTC m=+1497.366653324" lastFinishedPulling="2026-03-19 19:21:21.165403146 +0000 UTC m=+1505.919471459" observedRunningTime="2026-03-19 19:21:21.808052898 +0000 UTC m=+1506.562121211" watchObservedRunningTime="2026-03-19 19:21:21.812203879 +0000 UTC m=+1506.566272192" Mar 19 19:21:22 crc kubenswrapper[4826]: I0319 19:21:22.795561 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerStarted","Data":"740b071601440020c10b8738fcbe66615ffcf7efaa3ed86d59777bd00704c87f"} Mar 19 19:21:22 crc kubenswrapper[4826]: I0319 19:21:22.798428 4826 generic.go:334] "Generic (PLEG): container finished" podID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerID="542933139d76ea84d17315b736ffc8e0232eda1953b1a59a7b95f6273643fa4f" exitCode=143 Mar 19 19:21:22 crc kubenswrapper[4826]: I0319 19:21:22.800392 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerDied","Data":"542933139d76ea84d17315b736ffc8e0232eda1953b1a59a7b95f6273643fa4f"} Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.874546 4826 generic.go:334] "Generic (PLEG): container finished" podID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerID="9680c6258104006dbd167cbb1c88f520e4eeeffb438bf476d2f46d00460123e8" exitCode=0 Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.875198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerDied","Data":"9680c6258104006dbd167cbb1c88f520e4eeeffb438bf476d2f46d00460123e8"} Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.878495 4826 generic.go:334] "Generic (PLEG): container finished" podID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerID="a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" exitCode=0 Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.878529 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96becd19-70c0-4cc6-babc-650f7c8bac8f","Type":"ContainerDied","Data":"a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a"} Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.881363 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerStarted","Data":"7ecc35ff1fbf2252bd31a568aaab00e04b12f3f239e858907e692dc14b2cb543"} Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.881820 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:21:24 crc kubenswrapper[4826]: I0319 19:21:24.927142 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.024942511 podStartE2EDuration="9.927118994s" podCreationTimestamp="2026-03-19 19:21:15 +0000 UTC" firstStartedPulling="2026-03-19 19:21:19.676219774 +0000 UTC m=+1504.430288127" lastFinishedPulling="2026-03-19 19:21:23.578396297 +0000 UTC m=+1508.332464610" observedRunningTime="2026-03-19 19:21:24.905145413 +0000 UTC m=+1509.659213726" watchObservedRunningTime="2026-03-19 19:21:24.927118994 +0000 UTC m=+1509.681187297" Mar 19 19:21:25 crc kubenswrapper[4826]: E0319 19:21:25.233387 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a is running failed: container process not found" containerID="a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:21:25 crc kubenswrapper[4826]: E0319 19:21:25.234154 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a is running failed: container process not found" containerID="a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:21:25 crc kubenswrapper[4826]: E0319 19:21:25.234406 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a is running failed: container process not found" containerID="a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 19:21:25 crc kubenswrapper[4826]: E0319 19:21:25.234437 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerName="nova-scheduler-scheduler" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.400012 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:21:25 crc 
kubenswrapper[4826]: I0319 19:21:25.400062 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.454347 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.464935 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644598 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644696 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644722 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data\") pod \"96becd19-70c0-4cc6-babc-650f7c8bac8f\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644759 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7rvr\" (UniqueName: 
\"kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr\") pod \"96becd19-70c0-4cc6-babc-650f7c8bac8f\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644840 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644913 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.644938 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc7bd\" (UniqueName: \"kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd\") pod \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\" (UID: \"9edd25fe-7b6f-45d9-835e-6294924dd2e8\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.645098 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle\") pod \"96becd19-70c0-4cc6-babc-650f7c8bac8f\" (UID: \"96becd19-70c0-4cc6-babc-650f7c8bac8f\") " Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 
19:21:25.658093 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs" (OuterVolumeSpecName: "logs") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.669589 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd" (OuterVolumeSpecName: "kube-api-access-wc7bd") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "kube-api-access-wc7bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.673308 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr" (OuterVolumeSpecName: "kube-api-access-r7rvr") pod "96becd19-70c0-4cc6-babc-650f7c8bac8f" (UID: "96becd19-70c0-4cc6-babc-650f7c8bac8f"). InnerVolumeSpecName "kube-api-access-r7rvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.750394 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7rvr\" (UniqueName: \"kubernetes.io/projected/96becd19-70c0-4cc6-babc-650f7c8bac8f-kube-api-access-r7rvr\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.750441 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd25fe-7b6f-45d9-835e-6294924dd2e8-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.750451 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc7bd\" (UniqueName: \"kubernetes.io/projected/9edd25fe-7b6f-45d9-835e-6294924dd2e8-kube-api-access-wc7bd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.767888 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96becd19-70c0-4cc6-babc-650f7c8bac8f" (UID: "96becd19-70c0-4cc6-babc-650f7c8bac8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.768147 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.808589 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data" (OuterVolumeSpecName: "config-data") pod "96becd19-70c0-4cc6-babc-650f7c8bac8f" (UID: "96becd19-70c0-4cc6-babc-650f7c8bac8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.836492 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data" (OuterVolumeSpecName: "config-data") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.840908 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.852144 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.852176 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.852186 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.852194 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.852204 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96becd19-70c0-4cc6-babc-650f7c8bac8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.864859 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9edd25fe-7b6f-45d9-835e-6294924dd2e8" (UID: "9edd25fe-7b6f-45d9-835e-6294924dd2e8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.908418 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"96becd19-70c0-4cc6-babc-650f7c8bac8f","Type":"ContainerDied","Data":"84f337e41f248c89afa8077e2022e8d63829c257ded0212b17dfdd4a3a5619e0"} Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.908431 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.908473 4826 scope.go:117] "RemoveContainer" containerID="a0ed9893e8c7a2ea039f9b4692c6d5ced301b61ca392a211050c392653e4db7a" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.914970 4826 generic.go:334] "Generic (PLEG): container finished" podID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerID="57a770e2babd457e5d0e0509b6e06c3cd5477b4dcf558f2b3169a5db02195e38" exitCode=0 Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.915043 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerDied","Data":"57a770e2babd457e5d0e0509b6e06c3cd5477b4dcf558f2b3169a5db02195e38"} Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.915065 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f38764b-fa70-46d3-a045-024fe04b86a6","Type":"ContainerDied","Data":"13bbedafdb1a9c32f973db4f1bc22003b5e7fdd2cf4505922005ec8af8d0c9da"} Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.915076 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13bbedafdb1a9c32f973db4f1bc22003b5e7fdd2cf4505922005ec8af8d0c9da" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.920997 4826 generic.go:334] "Generic (PLEG): container finished" podID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" 
containerID="12132ab8807a509dbbdbe4be36d39620780d00d23d52b74989ee6106ede014d6" exitCode=137 Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.921051 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerDied","Data":"12132ab8807a509dbbdbe4be36d39620780d00d23d52b74989ee6106ede014d6"} Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.928732 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.929046 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9edd25fe-7b6f-45d9-835e-6294924dd2e8","Type":"ContainerDied","Data":"2365cccb9107e41c2e7bd5f369b24edabb660690f976ffa7f1bfe146d8c725d1"} Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.956803 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd25fe-7b6f-45d9-835e-6294924dd2e8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:25 crc kubenswrapper[4826]: I0319 19:21:25.986302 4826 scope.go:117] "RemoveContainer" containerID="9680c6258104006dbd167cbb1c88f520e4eeeffb438bf476d2f46d00460123e8" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.033229 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.101883 4826 scope.go:117] "RemoveContainer" containerID="b3acaa4f1ddd5de697825bd8b7ea0be75e8d7eaa7aebacc95ca99f2e3b632eac" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167297 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167334 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167352 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.167908 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-log" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167926 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-log" Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.167938 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerName="nova-scheduler-scheduler" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167944 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerName="nova-scheduler-scheduler" Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.167978 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-metadata" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.167985 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-metadata" Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.168001 4826 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a8c464b5-25b1-4543-8067-01ca9883d215" containerName="nova-manage" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168007 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c464b5-25b1-4543-8067-01ca9883d215" containerName="nova-manage" Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.168021 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-log" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168026 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-log" Mar 19 19:21:26 crc kubenswrapper[4826]: E0319 19:21:26.168041 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-api" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168046 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-api" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168251 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-log" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168272 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c464b5-25b1-4543-8067-01ca9883d215" containerName="nova-manage" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168292 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-log" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168305 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" containerName="nova-api-api" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168314 4826 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" containerName="nova-scheduler-scheduler" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.168326 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" containerName="nova-metadata-metadata" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.169520 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.169625 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.172408 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.172571 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.172840 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.180317 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.186561 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs\") pod \"2f38764b-fa70-46d3-a045-024fe04b86a6\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.186621 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data\") pod \"2f38764b-fa70-46d3-a045-024fe04b86a6\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " Mar 19 19:21:26 crc 
kubenswrapper[4826]: I0319 19:21:26.186710 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle\") pod \"2f38764b-fa70-46d3-a045-024fe04b86a6\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.186814 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs\") pod \"2f38764b-fa70-46d3-a045-024fe04b86a6\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.186946 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrxf\" (UniqueName: \"kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf\") pod \"2f38764b-fa70-46d3-a045-024fe04b86a6\" (UID: \"2f38764b-fa70-46d3-a045-024fe04b86a6\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.192849 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs" (OuterVolumeSpecName: "logs") pod "2f38764b-fa70-46d3-a045-024fe04b86a6" (UID: "2f38764b-fa70-46d3-a045-024fe04b86a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.196085 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.202210 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf" (OuterVolumeSpecName: "kube-api-access-qsrxf") pod "2f38764b-fa70-46d3-a045-024fe04b86a6" (UID: "2f38764b-fa70-46d3-a045-024fe04b86a6"). 
InnerVolumeSpecName "kube-api-access-qsrxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.211153 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.212856 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.219768 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.223722 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.228686 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f38764b-fa70-46d3-a045-024fe04b86a6" (UID: "2f38764b-fa70-46d3-a045-024fe04b86a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.256834 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data" (OuterVolumeSpecName: "config-data") pod "2f38764b-fa70-46d3-a045-024fe04b86a6" (UID: "2f38764b-fa70-46d3-a045-024fe04b86a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.265876 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2f38764b-fa70-46d3-a045-024fe04b86a6" (UID: "2f38764b-fa70-46d3-a045-024fe04b86a6"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.268769 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292251 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292316 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292474 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgv7c\" (UniqueName: \"kubernetes.io/projected/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-kube-api-access-fgv7c\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292574 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-config-data\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292638 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-config-data\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292708 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwsx\" (UniqueName: \"kubernetes.io/projected/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-kube-api-access-chwsx\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292741 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292807 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-logs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.292925 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.293988 4826 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f38764b-fa70-46d3-a045-024fe04b86a6-logs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.294015 
4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrxf\" (UniqueName: \"kubernetes.io/projected/2f38764b-fa70-46d3-a045-024fe04b86a6-kube-api-access-qsrxf\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.294030 4826 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.294044 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.294054 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f38764b-fa70-46d3-a045-024fe04b86a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.395349 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzrgb\" (UniqueName: \"kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb\") pod \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.395669 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle\") pod \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.395711 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data\") pod \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.395853 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts\") pod \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\" (UID: \"c5835e9d-d28e-47a1-aa89-c61f110f2f53\") " Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396169 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-config-data\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396222 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-config-data\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396268 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chwsx\" (UniqueName: \"kubernetes.io/projected/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-kube-api-access-chwsx\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396305 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 
19:21:26.396361 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396382 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-logs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396419 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396450 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.396522 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgv7c\" (UniqueName: \"kubernetes.io/projected/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-kube-api-access-fgv7c\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.397202 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-logs\") pod \"nova-api-0\" (UID: 
\"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.399777 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb" (OuterVolumeSpecName: "kube-api-access-kzrgb") pod "c5835e9d-d28e-47a1-aa89-c61f110f2f53" (UID: "c5835e9d-d28e-47a1-aa89-c61f110f2f53"). InnerVolumeSpecName "kube-api-access-kzrgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.402502 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.408713 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.412178 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-config-data\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.413181 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts" (OuterVolumeSpecName: "scripts") pod "c5835e9d-d28e-47a1-aa89-c61f110f2f53" (UID: "c5835e9d-d28e-47a1-aa89-c61f110f2f53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.413540 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.414226 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-config-data\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.415031 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgv7c\" (UniqueName: \"kubernetes.io/projected/c55cf1ac-b2f2-4801-bb17-8e9b703d33eb-kube-api-access-fgv7c\") pod \"nova-scheduler-0\" (UID: \"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb\") " pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.417161 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.418177 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwsx\" (UniqueName: \"kubernetes.io/projected/2fbc03ca-3891-4c9b-8ac9-88efbe950d4d-kube-api-access-chwsx\") pod \"nova-api-0\" (UID: \"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d\") " pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.498885 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.498916 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzrgb\" (UniqueName: \"kubernetes.io/projected/c5835e9d-d28e-47a1-aa89-c61f110f2f53-kube-api-access-kzrgb\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.531706 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5835e9d-d28e-47a1-aa89-c61f110f2f53" (UID: "c5835e9d-d28e-47a1-aa89-c61f110f2f53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.546718 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data" (OuterVolumeSpecName: "config-data") pod "c5835e9d-d28e-47a1-aa89-c61f110f2f53" (UID: "c5835e9d-d28e-47a1-aa89-c61f110f2f53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.547176 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.565781 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.602374 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.602415 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5835e9d-d28e-47a1-aa89-c61f110f2f53-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.944743 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.944794 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.944839 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c5835e9d-d28e-47a1-aa89-c61f110f2f53","Type":"ContainerDied","Data":"75fef9943fcfbb86259f93012d6d2771ddc07682ef67813d64a9442994826d17"} Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.945908 4826 scope.go:117] "RemoveContainer" containerID="12132ab8807a509dbbdbe4be36d39620780d00d23d52b74989ee6106ede014d6" Mar 19 19:21:26 crc kubenswrapper[4826]: I0319 19:21:26.971961 4826 scope.go:117] "RemoveContainer" containerID="283dc0c21fc4d13f34abba2b9306e1cc1ea1732a61cf2f788cb5b3c56edf823e" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.003538 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.041417 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.067623 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.068278 4826 scope.go:117] "RemoveContainer" containerID="0d7b9305c553990c2ae667a61313bf01d6ab1104cbc53cc92394c953df61bc43" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.089366 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.103281 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: E0319 19:21:27.103864 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-api" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.103888 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-api" Mar 19 19:21:27 crc kubenswrapper[4826]: E0319 19:21:27.103905 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-notifier" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.103913 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-notifier" Mar 19 19:21:27 crc kubenswrapper[4826]: E0319 19:21:27.103927 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-listener" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.103934 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-listener" Mar 19 19:21:27 crc kubenswrapper[4826]: E0319 19:21:27.103958 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-evaluator" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.103967 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-evaluator" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.104218 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-notifier" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.104239 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-listener" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.104263 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-evaluator" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.104269 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" containerName="aodh-api" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.106396 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.109334 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.109788 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.110522 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.110672 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.112230 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pqw5p" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.119806 4826 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.123641 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.125725 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.125905 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.135341 4826 scope.go:117] "RemoveContainer" containerID="066bb8481e3cc8105d0dc22392d969ef963ef4abb0cfa4d433ffeff5fca3f3ac" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.141165 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: W0319 19:21:27.147977 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbc03ca_3891_4c9b_8ac9_88efbe950d4d.slice/crio-88ff868891261353fd192165f5991f23bf91ab1cffaaf5e949a0a2b75ebaa717 WatchSource:0}: Error finding container 88ff868891261353fd192165f5991f23bf91ab1cffaaf5e949a0a2b75ebaa717: Status 404 returned error can't find the container with id 88ff868891261353fd192165f5991f23bf91ab1cffaaf5e949a0a2b75ebaa717 Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.152479 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.171967 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.216685 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.232361 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-config-data\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.232499 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpfs\" (UniqueName: \"kubernetes.io/projected/5660601a-af93-417b-b7c8-b68893743919-kube-api-access-7tpfs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.232628 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.232823 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233088 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk8l\" (UniqueName: \"kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233207 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/5660601a-af93-417b-b7c8-b68893743919-logs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233478 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233564 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233771 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.233851 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " 
pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.335898 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpfs\" (UniqueName: \"kubernetes.io/projected/5660601a-af93-417b-b7c8-b68893743919-kube-api-access-7tpfs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.335975 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.335994 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336046 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk8l\" (UniqueName: \"kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336076 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5660601a-af93-417b-b7c8-b68893743919-logs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336120 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336142 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336166 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336208 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336226 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.336268 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-config-data\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.337404 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5660601a-af93-417b-b7c8-b68893743919-logs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.343794 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.346618 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.354808 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.355040 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.357358 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc 
kubenswrapper[4826]: I0319 19:21:27.361285 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.361707 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.364284 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpfs\" (UniqueName: \"kubernetes.io/projected/5660601a-af93-417b-b7c8-b68893743919-kube-api-access-7tpfs\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.370011 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk8l\" (UniqueName: \"kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l\") pod \"aodh-0\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.370343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5660601a-af93-417b-b7c8-b68893743919-config-data\") pod \"nova-metadata-0\" (UID: \"5660601a-af93-417b-b7c8-b68893743919\") " pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.411290 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.450167 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.957280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d","Type":"ContainerStarted","Data":"fe9647307c950e11f8d5bee935de5d03e94a2cc85151ee3ee923195f45253b67"} Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.957966 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d","Type":"ContainerStarted","Data":"1f9201039c4883043594ab844a335544bd0c1696345e8df9a46361b92bf92309"} Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.957979 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fbc03ca-3891-4c9b-8ac9-88efbe950d4d","Type":"ContainerStarted","Data":"88ff868891261353fd192165f5991f23bf91ab1cffaaf5e949a0a2b75ebaa717"} Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.963091 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb","Type":"ContainerStarted","Data":"c2509e9acb47fce7feebcff2ca9a8aff00e503782e84378af72b09850241bdc6"} Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.963123 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c55cf1ac-b2f2-4801-bb17-8e9b703d33eb","Type":"ContainerStarted","Data":"cd035a9e4139a13592dcb9bd66a35609c237d6126a7c18a8503220f240b45c71"} Mar 19 19:21:27 crc kubenswrapper[4826]: I0319 19:21:27.965817 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.001133 4826 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.001111731 podStartE2EDuration="2.001111731s" podCreationTimestamp="2026-03-19 19:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:21:27.98453373 +0000 UTC m=+1512.738602063" watchObservedRunningTime="2026-03-19 19:21:28.001111731 +0000 UTC m=+1512.755180044" Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.007075 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f38764b-fa70-46d3-a045-024fe04b86a6" path="/var/lib/kubelet/pods/2f38764b-fa70-46d3-a045-024fe04b86a6/volumes" Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.017354 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96becd19-70c0-4cc6-babc-650f7c8bac8f" path="/var/lib/kubelet/pods/96becd19-70c0-4cc6-babc-650f7c8bac8f/volumes" Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.018418 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edd25fe-7b6f-45d9-835e-6294924dd2e8" path="/var/lib/kubelet/pods/9edd25fe-7b6f-45d9-835e-6294924dd2e8/volumes" Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.019068 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5835e9d-d28e-47a1-aa89-c61f110f2f53" path="/var/lib/kubelet/pods/c5835e9d-d28e-47a1-aa89-c61f110f2f53/volumes" Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.026911 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.026895654 podStartE2EDuration="2.026895654s" podCreationTimestamp="2026-03-19 19:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:21:28.007014184 +0000 UTC m=+1512.761082507" watchObservedRunningTime="2026-03-19 19:21:28.026895654 +0000 UTC m=+1512.780963957" Mar 19 19:21:28 
crc kubenswrapper[4826]: I0319 19:21:28.075543 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.974696 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerStarted","Data":"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1"} Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.975898 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerStarted","Data":"e0afcfa5950c62b0559388abb3a46dadde1fd5d30cabcde3a94d6f3a15e07660"} Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.980874 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5660601a-af93-417b-b7c8-b68893743919","Type":"ContainerStarted","Data":"7e360f1e9c9e8bf9a83e945e68dbe4a36bb50c9dcfc993bda7746e7a0ad7b89f"} Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.980919 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5660601a-af93-417b-b7c8-b68893743919","Type":"ContainerStarted","Data":"700391e8029e2e4b30d9ee4c955d4702bcb18837e43e176977dbda861737b900"} Mar 19 19:21:28 crc kubenswrapper[4826]: I0319 19:21:28.980935 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5660601a-af93-417b-b7c8-b68893743919","Type":"ContainerStarted","Data":"8f761d69f70ea71f80f59b9295db246d8985b03ee44c0bacecd3afe718d63dc9"} Mar 19 19:21:29 crc kubenswrapper[4826]: I0319 19:21:29.001241 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.001222263 podStartE2EDuration="2.001222263s" podCreationTimestamp="2026-03-19 19:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-19 19:21:28.998118648 +0000 UTC m=+1513.752186971" watchObservedRunningTime="2026-03-19 19:21:29.001222263 +0000 UTC m=+1513.755290586" Mar 19 19:21:31 crc kubenswrapper[4826]: I0319 19:21:31.030952 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerStarted","Data":"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610"} Mar 19 19:21:31 crc kubenswrapper[4826]: I0319 19:21:31.398970 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:31 crc kubenswrapper[4826]: I0319 19:21:31.399700 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:31 crc kubenswrapper[4826]: I0319 19:21:31.475061 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:31 crc kubenswrapper[4826]: I0319 19:21:31.566214 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.048288 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerStarted","Data":"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47"} Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.142519 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.238930 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.307483 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.308243 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2v9rn" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="registry-server" containerID="cri-o://bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739" gracePeriod=2 Mar 19 19:21:32 crc kubenswrapper[4826]: I0319 19:21:32.930625 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.004434 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2nv\" (UniqueName: \"kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv\") pod \"4baa475c-f008-41cf-99d8-88723e43abf0\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.004633 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities\") pod \"4baa475c-f008-41cf-99d8-88723e43abf0\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.004706 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content\") pod \"4baa475c-f008-41cf-99d8-88723e43abf0\" (UID: \"4baa475c-f008-41cf-99d8-88723e43abf0\") " Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.005936 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities" (OuterVolumeSpecName: "utilities") pod "4baa475c-f008-41cf-99d8-88723e43abf0" (UID: 
"4baa475c-f008-41cf-99d8-88723e43abf0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.018058 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv" (OuterVolumeSpecName: "kube-api-access-ph2nv") pod "4baa475c-f008-41cf-99d8-88723e43abf0" (UID: "4baa475c-f008-41cf-99d8-88723e43abf0"). InnerVolumeSpecName "kube-api-access-ph2nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.110398 4826 generic.go:334] "Generic (PLEG): container finished" podID="4baa475c-f008-41cf-99d8-88723e43abf0" containerID="bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739" exitCode=0 Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.110610 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerDied","Data":"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739"} Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.110638 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2v9rn" event={"ID":"4baa475c-f008-41cf-99d8-88723e43abf0","Type":"ContainerDied","Data":"11f2dd7d8e81892f718cd65b9e12b759fb3352a5f952ed556a8f6f53278f27c2"} Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.110707 4826 scope.go:117] "RemoveContainer" containerID="bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.110844 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2v9rn" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.115787 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.115812 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2nv\" (UniqueName: \"kubernetes.io/projected/4baa475c-f008-41cf-99d8-88723e43abf0-kube-api-access-ph2nv\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.133559 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerStarted","Data":"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4"} Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.153960 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4baa475c-f008-41cf-99d8-88723e43abf0" (UID: "4baa475c-f008-41cf-99d8-88723e43abf0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.167224 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.471621421 podStartE2EDuration="7.167206431s" podCreationTimestamp="2026-03-19 19:21:26 +0000 UTC" firstStartedPulling="2026-03-19 19:21:28.084883765 +0000 UTC m=+1512.838952078" lastFinishedPulling="2026-03-19 19:21:31.780468775 +0000 UTC m=+1516.534537088" observedRunningTime="2026-03-19 19:21:33.167081568 +0000 UTC m=+1517.921149891" watchObservedRunningTime="2026-03-19 19:21:33.167206431 +0000 UTC m=+1517.921274754" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.219372 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4baa475c-f008-41cf-99d8-88723e43abf0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.237844 4826 scope.go:117] "RemoveContainer" containerID="072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.353547 4826 scope.go:117] "RemoveContainer" containerID="a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.404353 4826 scope.go:117] "RemoveContainer" containerID="bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739" Mar 19 19:21:33 crc kubenswrapper[4826]: E0319 19:21:33.405142 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739\": container with ID starting with bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739 not found: ID does not exist" containerID="bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.405181 4826 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739"} err="failed to get container status \"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739\": rpc error: code = NotFound desc = could not find container \"bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739\": container with ID starting with bdfa9673fef9a52cc9d3757cd500cdd05b6aa8bd455fa9fab3a2b7a96e629739 not found: ID does not exist" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.405207 4826 scope.go:117] "RemoveContainer" containerID="072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93" Mar 19 19:21:33 crc kubenswrapper[4826]: E0319 19:21:33.406646 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93\": container with ID starting with 072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93 not found: ID does not exist" containerID="072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.406703 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93"} err="failed to get container status \"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93\": rpc error: code = NotFound desc = could not find container \"072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93\": container with ID starting with 072758da9dcfd68b3f3f95263e3295ce07af12457db174669a59799058edac93 not found: ID does not exist" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.406722 4826 scope.go:117] "RemoveContainer" containerID="a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d" Mar 19 19:21:33 crc kubenswrapper[4826]: E0319 
19:21:33.407163 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d\": container with ID starting with a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d not found: ID does not exist" containerID="a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.407205 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d"} err="failed to get container status \"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d\": rpc error: code = NotFound desc = could not find container \"a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d\": container with ID starting with a12383b49bba35c8a07ee384ef9f634c2cb1c1a73d15fd4d30d4a3260552997d not found: ID does not exist" Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.490732 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.538018 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2v9rn"] Mar 19 19:21:33 crc kubenswrapper[4826]: I0319 19:21:33.994200 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" path="/var/lib/kubelet/pods/4baa475c-f008-41cf-99d8-88723e43abf0/volumes" Mar 19 19:21:36 crc kubenswrapper[4826]: I0319 19:21:36.549048 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:21:36 crc kubenswrapper[4826]: I0319 19:21:36.551915 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 19:21:36 crc kubenswrapper[4826]: I0319 
19:21:36.566290 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 19:21:36 crc kubenswrapper[4826]: I0319 19:21:36.620963 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 19:21:37 crc kubenswrapper[4826]: I0319 19:21:37.219718 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 19:21:37 crc kubenswrapper[4826]: I0319 19:21:37.412327 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:21:37 crc kubenswrapper[4826]: I0319 19:21:37.412521 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 19:21:37 crc kubenswrapper[4826]: I0319 19:21:37.563817 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fbc03ca-3891-4c9b-8ac9-88efbe950d4d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:37 crc kubenswrapper[4826]: I0319 19:21:37.563854 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fbc03ca-3891-4c9b-8ac9-88efbe950d4d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.13:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:38 crc kubenswrapper[4826]: I0319 19:21:38.425818 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5660601a-af93-417b-b7c8-b68893743919" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.16:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:38 crc kubenswrapper[4826]: I0319 19:21:38.425840 4826 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5660601a-af93-417b-b7c8-b68893743919" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.16:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 19:21:44 crc kubenswrapper[4826]: I0319 19:21:44.548149 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:21:44 crc kubenswrapper[4826]: I0319 19:21:44.548743 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 19:21:45 crc kubenswrapper[4826]: I0319 19:21:45.412443 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:21:45 crc kubenswrapper[4826]: I0319 19:21:45.412892 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 19:21:46 crc kubenswrapper[4826]: I0319 19:21:46.089270 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 19:21:46 crc kubenswrapper[4826]: I0319 19:21:46.562060 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:21:46 crc kubenswrapper[4826]: I0319 19:21:46.562389 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 19:21:46 crc kubenswrapper[4826]: I0319 19:21:46.578874 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:21:46 crc kubenswrapper[4826]: I0319 19:21:46.579529 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 19:21:47 crc kubenswrapper[4826]: I0319 19:21:47.420256 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:21:47 crc kubenswrapper[4826]: I0319 19:21:47.420837 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 19:21:47 crc kubenswrapper[4826]: I0319 19:21:47.428970 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:21:48 crc kubenswrapper[4826]: I0319 19:21:48.346033 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 19:21:50 crc kubenswrapper[4826]: I0319 19:21:50.966586 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:50 crc kubenswrapper[4826]: I0319 19:21:50.967396 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" containerName="kube-state-metrics" containerID="cri-o://5bb004602f7948e5a1d29c36fe0641da926560d38c110106c8ccbaf1a5f48fa0" gracePeriod=30 Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.064326 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.064556 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="144ed0ba-58bd-4991-92af-8ed3f401e52c" containerName="mysqld-exporter" containerID="cri-o://1fa1f710f9d4180d7dfec7f36e73a9bbdc447e1b123dce5f4813e8f7c60a6c69" gracePeriod=30 Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.375170 4826 generic.go:334] "Generic (PLEG): container finished" podID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" containerID="5bb004602f7948e5a1d29c36fe0641da926560d38c110106c8ccbaf1a5f48fa0" exitCode=2 Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.375427 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"7be203b6-dbb5-49d5-935f-9844ee4d6c11","Type":"ContainerDied","Data":"5bb004602f7948e5a1d29c36fe0641da926560d38c110106c8ccbaf1a5f48fa0"} Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.384200 4826 generic.go:334] "Generic (PLEG): container finished" podID="144ed0ba-58bd-4991-92af-8ed3f401e52c" containerID="1fa1f710f9d4180d7dfec7f36e73a9bbdc447e1b123dce5f4813e8f7c60a6c69" exitCode=2 Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.384241 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"144ed0ba-58bd-4991-92af-8ed3f401e52c","Type":"ContainerDied","Data":"1fa1f710f9d4180d7dfec7f36e73a9bbdc447e1b123dce5f4813e8f7c60a6c69"} Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.552838 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.688587 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8pbr\" (UniqueName: \"kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr\") pod \"7be203b6-dbb5-49d5-935f-9844ee4d6c11\" (UID: \"7be203b6-dbb5-49d5-935f-9844ee4d6c11\") " Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.695042 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr" (OuterVolumeSpecName: "kube-api-access-d8pbr") pod "7be203b6-dbb5-49d5-935f-9844ee4d6c11" (UID: "7be203b6-dbb5-49d5-935f-9844ee4d6c11"). InnerVolumeSpecName "kube-api-access-d8pbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.730937 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.793473 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8pbr\" (UniqueName: \"kubernetes.io/projected/7be203b6-dbb5-49d5-935f-9844ee4d6c11-kube-api-access-d8pbr\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.894480 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle\") pod \"144ed0ba-58bd-4991-92af-8ed3f401e52c\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.894528 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data\") pod \"144ed0ba-58bd-4991-92af-8ed3f401e52c\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.894666 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tbt6\" (UniqueName: \"kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6\") pod \"144ed0ba-58bd-4991-92af-8ed3f401e52c\" (UID: \"144ed0ba-58bd-4991-92af-8ed3f401e52c\") " Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.912240 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6" (OuterVolumeSpecName: "kube-api-access-4tbt6") pod "144ed0ba-58bd-4991-92af-8ed3f401e52c" (UID: "144ed0ba-58bd-4991-92af-8ed3f401e52c"). InnerVolumeSpecName "kube-api-access-4tbt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.924637 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "144ed0ba-58bd-4991-92af-8ed3f401e52c" (UID: "144ed0ba-58bd-4991-92af-8ed3f401e52c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:51 crc kubenswrapper[4826]: I0319 19:21:51.962609 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data" (OuterVolumeSpecName: "config-data") pod "144ed0ba-58bd-4991-92af-8ed3f401e52c" (UID: "144ed0ba-58bd-4991-92af-8ed3f401e52c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.002492 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.002553 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/144ed0ba-58bd-4991-92af-8ed3f401e52c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.002574 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tbt6\" (UniqueName: \"kubernetes.io/projected/144ed0ba-58bd-4991-92af-8ed3f401e52c-kube-api-access-4tbt6\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.396960 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"144ed0ba-58bd-4991-92af-8ed3f401e52c","Type":"ContainerDied","Data":"744ebc93e8d7de4b825bb37739953fa667633fcf7cc9927c4152231f46c21762"} Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.397319 4826 scope.go:117] "RemoveContainer" containerID="1fa1f710f9d4180d7dfec7f36e73a9bbdc447e1b123dce5f4813e8f7c60a6c69" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.397000 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.400089 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7be203b6-dbb5-49d5-935f-9844ee4d6c11","Type":"ContainerDied","Data":"5bb48a28dbd5ccc899c69c66e3bc9b2c57c5fe24c6d4bb5c9c72272b6309228a"} Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.400151 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.457727 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.465953 4826 scope.go:117] "RemoveContainer" containerID="5bb004602f7948e5a1d29c36fe0641da926560d38c110106c8ccbaf1a5f48fa0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.504820 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.520384 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.536750 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552221 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: E0319 
19:21:52.552746 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="extract-utilities" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552762 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="extract-utilities" Mar 19 19:21:52 crc kubenswrapper[4826]: E0319 19:21:52.552779 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="extract-content" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552784 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="extract-content" Mar 19 19:21:52 crc kubenswrapper[4826]: E0319 19:21:52.552813 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144ed0ba-58bd-4991-92af-8ed3f401e52c" containerName="mysqld-exporter" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552819 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="144ed0ba-58bd-4991-92af-8ed3f401e52c" containerName="mysqld-exporter" Mar 19 19:21:52 crc kubenswrapper[4826]: E0319 19:21:52.552828 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" containerName="kube-state-metrics" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552834 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" containerName="kube-state-metrics" Mar 19 19:21:52 crc kubenswrapper[4826]: E0319 19:21:52.552856 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="registry-server" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.552863 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="registry-server" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 
19:21:52.553065 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="144ed0ba-58bd-4991-92af-8ed3f401e52c" containerName="mysqld-exporter" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.553077 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" containerName="kube-state-metrics" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.553097 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baa475c-f008-41cf-99d8-88723e43abf0" containerName="registry-server" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.554063 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.556859 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.557034 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.569053 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.570712 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.577110 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.577642 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.595683 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.612097 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.623749 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.626606 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.637586 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.727888 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpkq\" (UniqueName: \"kubernetes.io/projected/a472fa43-9652-40de-9038-082a4314b962-kube-api-access-wvpkq\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.727936 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mq7q\" (UniqueName: \"kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q\") pod \"redhat-marketplace-vt6c6\" (UID: 
\"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.727970 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.728292 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-config-data\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.728408 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.728587 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.728924 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtqcg\" (UniqueName: \"kubernetes.io/projected/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-api-access-dtqcg\") pod 
\"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.729136 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.729216 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.729285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.729308 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831223 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content\") pod 
\"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831503 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831621 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831737 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831746 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831871 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpkq\" (UniqueName: \"kubernetes.io/projected/a472fa43-9652-40de-9038-082a4314b962-kube-api-access-wvpkq\") pod \"mysqld-exporter-0\" (UID: 
\"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mq7q\" (UniqueName: \"kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.831986 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.832051 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-config-data\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.832087 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.832107 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 
19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.832238 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.832458 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtqcg\" (UniqueName: \"kubernetes.io/projected/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-api-access-dtqcg\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.845788 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.845819 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-config-data\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.846223 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.846292 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a472fa43-9652-40de-9038-082a4314b962-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.846570 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.867512 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.881394 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpkq\" (UniqueName: \"kubernetes.io/projected/a472fa43-9652-40de-9038-082a4314b962-kube-api-access-wvpkq\") pod \"mysqld-exporter-0\" (UID: \"a472fa43-9652-40de-9038-082a4314b962\") " pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.881404 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mq7q\" (UniqueName: \"kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q\") pod \"redhat-marketplace-vt6c6\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.882274 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtqcg\" (UniqueName: 
\"kubernetes.io/projected/06606200-618f-46b2-afb9-e5e2738fe2dd-kube-api-access-dtqcg\") pod \"kube-state-metrics-0\" (UID: \"06606200-618f-46b2-afb9-e5e2738fe2dd\") " pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.893208 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.896713 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 19:21:52 crc kubenswrapper[4826]: I0319 19:21:52.950264 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.539760 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.540266 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-central-agent" containerID="cri-o://ed7e294926a0ef197174066c924ebcfb807fcd4eaf2ebe457810591beb97a0c9" gracePeriod=30 Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.540350 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-notification-agent" containerID="cri-o://29b977b63eb31265f7dfb8fbdc21a0f9c9dec3c445d20fa03128babcecd48db0" gracePeriod=30 Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.540287 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="sg-core" containerID="cri-o://740b071601440020c10b8738fcbe66615ffcf7efaa3ed86d59777bd00704c87f" gracePeriod=30 Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 
19:21:53.540302 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="proxy-httpd" containerID="cri-o://7ecc35ff1fbf2252bd31a568aaab00e04b12f3f239e858907e692dc14b2cb543" gracePeriod=30 Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.573588 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 19:21:53 crc kubenswrapper[4826]: W0319 19:21:53.574031 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda472fa43_9652_40de_9038_082a4314b962.slice/crio-a5b9cc4206670697bea06c71218a48012d4179dbc14b2ecd64dfe7fae1128b8e WatchSource:0}: Error finding container a5b9cc4206670697bea06c71218a48012d4179dbc14b2ecd64dfe7fae1128b8e: Status 404 returned error can't find the container with id a5b9cc4206670697bea06c71218a48012d4179dbc14b2ecd64dfe7fae1128b8e Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.599996 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.619015 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.995210 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144ed0ba-58bd-4991-92af-8ed3f401e52c" path="/var/lib/kubelet/pods/144ed0ba-58bd-4991-92af-8ed3f401e52c/volumes" Mar 19 19:21:53 crc kubenswrapper[4826]: I0319 19:21:53.998243 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be203b6-dbb5-49d5-935f-9844ee4d6c11" path="/var/lib/kubelet/pods/7be203b6-dbb5-49d5-935f-9844ee4d6c11/volumes" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.439958 4826 generic.go:334] "Generic (PLEG): container finished" podID="2871d9a2-f531-494d-bbc0-0c861bd78686" 
containerID="7ecc35ff1fbf2252bd31a568aaab00e04b12f3f239e858907e692dc14b2cb543" exitCode=0 Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440214 4826 generic.go:334] "Generic (PLEG): container finished" podID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerID="740b071601440020c10b8738fcbe66615ffcf7efaa3ed86d59777bd00704c87f" exitCode=2 Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440221 4826 generic.go:334] "Generic (PLEG): container finished" podID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerID="29b977b63eb31265f7dfb8fbdc21a0f9c9dec3c445d20fa03128babcecd48db0" exitCode=0 Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440228 4826 generic.go:334] "Generic (PLEG): container finished" podID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerID="ed7e294926a0ef197174066c924ebcfb807fcd4eaf2ebe457810591beb97a0c9" exitCode=0 Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440263 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerDied","Data":"7ecc35ff1fbf2252bd31a568aaab00e04b12f3f239e858907e692dc14b2cb543"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440290 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerDied","Data":"740b071601440020c10b8738fcbe66615ffcf7efaa3ed86d59777bd00704c87f"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440301 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerDied","Data":"29b977b63eb31265f7dfb8fbdc21a0f9c9dec3c445d20fa03128babcecd48db0"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.440310 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerDied","Data":"ed7e294926a0ef197174066c924ebcfb807fcd4eaf2ebe457810591beb97a0c9"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.442426 4826 generic.go:334] "Generic (PLEG): container finished" podID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerID="cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d" exitCode=0 Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.442490 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerDied","Data":"cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.442514 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerStarted","Data":"b7da84f67e5138897bd314477ed242fc90021a93ca6f897ffb359f132238d7a6"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.449521 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06606200-618f-46b2-afb9-e5e2738fe2dd","Type":"ContainerStarted","Data":"e8090ff562e4747dcb56536a38fb539f83dbea2c650e0675e579d269b88c7b97"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.455729 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a472fa43-9652-40de-9038-082a4314b962","Type":"ContainerStarted","Data":"a5b9cc4206670697bea06c71218a48012d4179dbc14b2ecd64dfe7fae1128b8e"} Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.603762 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.704739 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.704797 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chxx8\" (UniqueName: \"kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.704864 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.704905 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.705042 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.705107 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.705139 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle\") pod \"2871d9a2-f531-494d-bbc0-0c861bd78686\" (UID: \"2871d9a2-f531-494d-bbc0-0c861bd78686\") " Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.705404 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.705435 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.706266 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.706296 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871d9a2-f531-494d-bbc0-0c861bd78686-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.710368 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts" (OuterVolumeSpecName: "scripts") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.715444 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8" (OuterVolumeSpecName: "kube-api-access-chxx8") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "kube-api-access-chxx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.761410 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.806232 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.808019 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.808050 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.808064 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.808077 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chxx8\" (UniqueName: \"kubernetes.io/projected/2871d9a2-f531-494d-bbc0-0c861bd78686-kube-api-access-chxx8\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.851803 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data" (OuterVolumeSpecName: "config-data") pod "2871d9a2-f531-494d-bbc0-0c861bd78686" (UID: "2871d9a2-f531-494d-bbc0-0c861bd78686"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:21:54 crc kubenswrapper[4826]: I0319 19:21:54.909739 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871d9a2-f531-494d-bbc0-0c861bd78686-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.400958 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.401301 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.401348 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.402174 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.402232 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" 
containerID="cri-o://856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" gracePeriod=600 Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.466891 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871d9a2-f531-494d-bbc0-0c861bd78686","Type":"ContainerDied","Data":"cb5a54c730ec4dc361ae794ceca976eea05af5e6ceaa3e176c9f15b322e470da"} Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.466951 4826 scope.go:117] "RemoveContainer" containerID="7ecc35ff1fbf2252bd31a568aaab00e04b12f3f239e858907e692dc14b2cb543" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.467100 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.473432 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerStarted","Data":"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79"} Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.476918 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06606200-618f-46b2-afb9-e5e2738fe2dd","Type":"ContainerStarted","Data":"81d2c55cdb1e1d21961a423a119cafb491b9ef9812c7bd1544f75f348811b865"} Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.477721 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.485784 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"a472fa43-9652-40de-9038-082a4314b962","Type":"ContainerStarted","Data":"8220af8d1fd93174d9eff619aab79145989d48d2658022a4507a9308c3810228"} Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.503254 4826 scope.go:117] "RemoveContainer" 
containerID="740b071601440020c10b8738fcbe66615ffcf7efaa3ed86d59777bd00704c87f" Mar 19 19:21:55 crc kubenswrapper[4826]: E0319 19:21:55.542547 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.547714 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.176450311 podStartE2EDuration="3.547693393s" podCreationTimestamp="2026-03-19 19:21:52 +0000 UTC" firstStartedPulling="2026-03-19 19:21:53.569401759 +0000 UTC m=+1538.323470072" lastFinishedPulling="2026-03-19 19:21:53.940644841 +0000 UTC m=+1538.694713154" observedRunningTime="2026-03-19 19:21:55.526876279 +0000 UTC m=+1540.280944592" watchObservedRunningTime="2026-03-19 19:21:55.547693393 +0000 UTC m=+1540.301761716" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.595881 4826 scope.go:117] "RemoveContainer" containerID="29b977b63eb31265f7dfb8fbdc21a0f9c9dec3c445d20fa03128babcecd48db0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.599819 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.613217 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.633063 4826 scope.go:117] "RemoveContainer" containerID="ed7e294926a0ef197174066c924ebcfb807fcd4eaf2ebe457810591beb97a0c9" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.635300 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/mysqld-exporter-0" podStartSLOduration=3.094179621 podStartE2EDuration="3.635280939s" podCreationTimestamp="2026-03-19 19:21:52 +0000 UTC" firstStartedPulling="2026-03-19 19:21:53.580759473 +0000 UTC m=+1538.334827786" lastFinishedPulling="2026-03-19 19:21:54.121860791 +0000 UTC m=+1538.875929104" observedRunningTime="2026-03-19 19:21:55.553874542 +0000 UTC m=+1540.307942855" watchObservedRunningTime="2026-03-19 19:21:55.635280939 +0000 UTC m=+1540.389349252" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.649116 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:55 crc kubenswrapper[4826]: E0319 19:21:55.649957 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-notification-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.649983 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-notification-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: E0319 19:21:55.650008 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="proxy-httpd" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650018 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="proxy-httpd" Mar 19 19:21:55 crc kubenswrapper[4826]: E0319 19:21:55.650046 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-central-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650057 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-central-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: E0319 19:21:55.650070 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="sg-core" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650077 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="sg-core" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650395 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-central-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650414 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="ceilometer-notification-agent" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650437 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="sg-core" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.650458 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" containerName="proxy-httpd" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.653325 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.656959 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.657207 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.657322 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.662773 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.833787 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.833823 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.833853 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.834041 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.834157 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.834389 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.834519 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.834640 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7w8\" (UniqueName: \"kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936320 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd\") pod \"ceilometer-0\" (UID: 
\"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936669 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936711 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936745 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936781 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936833 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936830 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936869 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.936916 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7w8\" (UniqueName: \"kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.937150 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.942328 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.943189 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 
19:21:55.943780 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.943937 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.945286 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.964092 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7w8\" (UniqueName: \"kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8\") pod \"ceilometer-0\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " pod="openstack/ceilometer-0" Mar 19 19:21:55 crc kubenswrapper[4826]: I0319 19:21:55.986685 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.005398 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2871d9a2-f531-494d-bbc0-0c861bd78686" path="/var/lib/kubelet/pods/2871d9a2-f531-494d-bbc0-0c861bd78686/volumes" Mar 19 19:21:56 crc kubenswrapper[4826]: W0319 19:21:56.499264 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c065c18_3cb1_46f6_9d8d_b8acd51eb933.slice/crio-41dea259fd69848f6dd597e0a493342713ef6e92c3bec929fad2da9a3f66eeb0 WatchSource:0}: Error finding container 41dea259fd69848f6dd597e0a493342713ef6e92c3bec929fad2da9a3f66eeb0: Status 404 returned error can't find the container with id 41dea259fd69848f6dd597e0a493342713ef6e92c3bec929fad2da9a3f66eeb0 Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.500534 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" exitCode=0 Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.500629 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"} Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.500788 4826 scope.go:117] "RemoveContainer" containerID="1238977d0e09446586a9032546be2d2ff642cd7a1d8371018f40396f2b3eff68" Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.501558 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:21:56 crc kubenswrapper[4826]: E0319 19:21:56.502122 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.506704 4826 generic.go:334] "Generic (PLEG): container finished" podID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerID="c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79" exitCode=0 Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.506797 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerDied","Data":"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79"} Mar 19 19:21:56 crc kubenswrapper[4826]: I0319 19:21:56.509453 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:21:57 crc kubenswrapper[4826]: I0319 19:21:57.519034 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerStarted","Data":"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9"} Mar 19 19:21:57 crc kubenswrapper[4826]: I0319 19:21:57.519563 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerStarted","Data":"41dea259fd69848f6dd597e0a493342713ef6e92c3bec929fad2da9a3f66eeb0"} Mar 19 19:21:57 crc kubenswrapper[4826]: I0319 19:21:57.524934 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerStarted","Data":"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af"} Mar 19 19:21:57 crc kubenswrapper[4826]: I0319 19:21:57.551598 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vt6c6" podStartSLOduration=2.8442173200000003 podStartE2EDuration="5.551577085s" podCreationTimestamp="2026-03-19 19:21:52 +0000 UTC" firstStartedPulling="2026-03-19 19:21:54.44951099 +0000 UTC m=+1539.203579303" lastFinishedPulling="2026-03-19 19:21:57.156870755 +0000 UTC m=+1541.910939068" observedRunningTime="2026-03-19 19:21:57.542708461 +0000 UTC m=+1542.296776824" watchObservedRunningTime="2026-03-19 19:21:57.551577085 +0000 UTC m=+1542.305645408" Mar 19 19:21:58 crc kubenswrapper[4826]: I0319 19:21:58.537557 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerStarted","Data":"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e"} Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.452709 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-xfct2"] Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.468457 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-xfct2"] Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.549518 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-pvzvx"] Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.551570 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerStarted","Data":"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885"} Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.551705 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.567407 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pvzvx"] Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.751042 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr8jz\" (UniqueName: \"kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.751424 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.751472 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.852683 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr8jz\" (UniqueName: \"kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.852793 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.852830 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.858484 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.865938 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.869134 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr8jz\" (UniqueName: \"kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz\") pod \"heat-db-sync-pvzvx\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") " pod="openstack/heat-db-sync-pvzvx" Mar 19 19:21:59 crc kubenswrapper[4826]: I0319 19:21:59.884364 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-pvzvx" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.033333 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92c353f-6fae-4be8-8580-4066bb56e856" path="/var/lib/kubelet/pods/d92c353f-6fae-4be8-8580-4066bb56e856/volumes" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.157805 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565802-wsq4t"] Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.160012 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.162065 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.162098 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.162868 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.163957 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9dm\" (UniqueName: \"kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm\") pod \"auto-csr-approver-29565802-wsq4t\" (UID: \"f8d51938-c21d-4578-afe3-ecbbf8d67bd2\") " pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.183715 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-wsq4t"] Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.266895 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9dm\" (UniqueName: 
\"kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm\") pod \"auto-csr-approver-29565802-wsq4t\" (UID: \"f8d51938-c21d-4578-afe3-ecbbf8d67bd2\") " pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.284587 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9dm\" (UniqueName: \"kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm\") pod \"auto-csr-approver-29565802-wsq4t\" (UID: \"f8d51938-c21d-4578-afe3-ecbbf8d67bd2\") " pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.401444 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-pvzvx"] Mar 19 19:22:00 crc kubenswrapper[4826]: W0319 19:22:00.403812 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe3204f1_b777_49ae_8ba5_2f30f639dd1e.slice/crio-2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e WatchSource:0}: Error finding container 2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e: Status 404 returned error can't find the container with id 2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.489703 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.565358 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pvzvx" event={"ID":"be3204f1-b777-49ae-8ba5-2f30f639dd1e","Type":"ContainerStarted","Data":"2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e"} Mar 19 19:22:00 crc kubenswrapper[4826]: I0319 19:22:00.982700 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-wsq4t"] Mar 19 19:22:00 crc kubenswrapper[4826]: W0319 19:22:00.988714 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d51938_c21d_4578_afe3_ecbbf8d67bd2.slice/crio-7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba WatchSource:0}: Error finding container 7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba: Status 404 returned error can't find the container with id 7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba Mar 19 19:22:01 crc kubenswrapper[4826]: I0319 19:22:01.582306 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerStarted","Data":"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99"} Mar 19 19:22:01 crc kubenswrapper[4826]: I0319 19:22:01.583899 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:22:01 crc kubenswrapper[4826]: I0319 19:22:01.590836 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" event={"ID":"f8d51938-c21d-4578-afe3-ecbbf8d67bd2","Type":"ContainerStarted","Data":"7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba"} Mar 19 19:22:01 crc kubenswrapper[4826]: I0319 19:22:01.889253 4826 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.301729677 podStartE2EDuration="6.889236563s" podCreationTimestamp="2026-03-19 19:21:55 +0000 UTC" firstStartedPulling="2026-03-19 19:21:56.504036017 +0000 UTC m=+1541.258104330" lastFinishedPulling="2026-03-19 19:22:01.091542903 +0000 UTC m=+1545.845611216" observedRunningTime="2026-03-19 19:22:01.606987241 +0000 UTC m=+1546.361055574" watchObservedRunningTime="2026-03-19 19:22:01.889236563 +0000 UTC m=+1546.643304876" Mar 19 19:22:01 crc kubenswrapper[4826]: I0319 19:22:01.904125 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:02 crc kubenswrapper[4826]: I0319 19:22:02.529227 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:02 crc kubenswrapper[4826]: I0319 19:22:02.926814 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:02 crc kubenswrapper[4826]: I0319 19:22:02.929268 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 19:22:02 crc kubenswrapper[4826]: I0319 19:22:02.951709 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:02 crc kubenswrapper[4826]: I0319 19:22:02.952217 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.060582 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.644861 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-central-agent" 
containerID="cri-o://2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9" gracePeriod=30 Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.645767 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" event={"ID":"f8d51938-c21d-4578-afe3-ecbbf8d67bd2","Type":"ContainerStarted","Data":"760861aec5f18a584b8e36885125a3777bbbca2857f1b2c7547668c3e91fe1d9"} Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.645997 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="proxy-httpd" containerID="cri-o://ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99" gracePeriod=30 Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.646075 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="sg-core" containerID="cri-o://173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885" gracePeriod=30 Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.646113 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-notification-agent" containerID="cri-o://b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e" gracePeriod=30 Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.727075 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" podStartSLOduration=2.695469829 podStartE2EDuration="3.727051802s" podCreationTimestamp="2026-03-19 19:22:00 +0000 UTC" firstStartedPulling="2026-03-19 19:22:00.994849406 +0000 UTC m=+1545.748917719" lastFinishedPulling="2026-03-19 19:22:02.026431379 +0000 UTC m=+1546.780499692" observedRunningTime="2026-03-19 19:22:03.675118967 +0000 
UTC m=+1548.429187270" watchObservedRunningTime="2026-03-19 19:22:03.727051802 +0000 UTC m=+1548.481120115" Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.796287 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:03 crc kubenswrapper[4826]: I0319 19:22:03.911252 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667813 4826 generic.go:334] "Generic (PLEG): container finished" podID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerID="ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99" exitCode=0 Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667856 4826 generic.go:334] "Generic (PLEG): container finished" podID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerID="173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885" exitCode=2 Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667867 4826 generic.go:334] "Generic (PLEG): container finished" podID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerID="b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e" exitCode=0 Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667870 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerDied","Data":"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99"} Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667921 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerDied","Data":"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885"} Mar 19 19:22:04 crc kubenswrapper[4826]: I0319 19:22:04.667932 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerDied","Data":"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e"} Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.399432 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.429633 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.429729 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7w8\" (UniqueName: \"kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.429934 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430424 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430483 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430546 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430619 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430711 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts\") pod \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\" (UID: \"7c065c18-3cb1-46f6-9d8d-b8acd51eb933\") " Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.430738 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.431052 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.437424 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts" (OuterVolumeSpecName: "scripts") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.437752 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8" (OuterVolumeSpecName: "kube-api-access-gl7w8") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "kube-api-access-gl7w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.503556 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.535584 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7w8\" (UniqueName: \"kubernetes.io/projected/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-kube-api-access-gl7w8\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.535615 4826 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.535624 4826 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.535633 4826 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.535640 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.595040 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.602954 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.637529 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.637575 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.646484 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data" (OuterVolumeSpecName: "config-data") pod "7c065c18-3cb1-46f6-9d8d-b8acd51eb933" (UID: "7c065c18-3cb1-46f6-9d8d-b8acd51eb933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.688696 4826 generic.go:334] "Generic (PLEG): container finished" podID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerID="2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9" exitCode=0 Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.688828 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.688863 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerDied","Data":"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9"} Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.688912 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c065c18-3cb1-46f6-9d8d-b8acd51eb933","Type":"ContainerDied","Data":"41dea259fd69848f6dd597e0a493342713ef6e92c3bec929fad2da9a3f66eeb0"} Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.688930 4826 scope.go:117] "RemoveContainer" containerID="ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.692803 4826 generic.go:334] "Generic (PLEG): container finished" podID="f8d51938-c21d-4578-afe3-ecbbf8d67bd2" containerID="760861aec5f18a584b8e36885125a3777bbbca2857f1b2c7547668c3e91fe1d9" exitCode=0 Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.692996 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vt6c6" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="registry-server" containerID="cri-o://9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af" gracePeriod=2 Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.693198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" event={"ID":"f8d51938-c21d-4578-afe3-ecbbf8d67bd2","Type":"ContainerDied","Data":"760861aec5f18a584b8e36885125a3777bbbca2857f1b2c7547668c3e91fe1d9"} Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.721600 4826 scope.go:117] "RemoveContainer" containerID="173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 
19:22:05.740919 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c065c18-3cb1-46f6-9d8d-b8acd51eb933-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.743758 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.762676 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.787832 4826 scope.go:117] "RemoveContainer" containerID="b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.787995 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.788533 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="proxy-httpd" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788554 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="proxy-httpd" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.788572 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-central-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788585 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-central-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.788607 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-notification-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788614 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-notification-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.788631 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="sg-core" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788638 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="sg-core" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788898 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-notification-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788922 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="sg-core" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788936 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="proxy-httpd" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.788958 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" containerName="ceilometer-central-agent" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.793614 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.799505 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.799917 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.800135 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.800338 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.843838 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-log-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844026 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-config-data\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844068 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844173 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844388 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-scripts\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844506 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlqc\" (UniqueName: \"kubernetes.io/projected/ab298593-ac97-4031-8bfc-b0e5be9b341a-kube-api-access-mvlqc\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844590 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.844760 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-run-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.869777 4826 scope.go:117] "RemoveContainer" containerID="2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.908101 4826 scope.go:117] "RemoveContainer" 
containerID="ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.908527 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99\": container with ID starting with ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99 not found: ID does not exist" containerID="ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.908620 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99"} err="failed to get container status \"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99\": rpc error: code = NotFound desc = could not find container \"ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99\": container with ID starting with ba1baf74e4b19172dc90c6ef4fe85222e46a7144877370076758cbc3b75dcc99 not found: ID does not exist" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.908676 4826 scope.go:117] "RemoveContainer" containerID="173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.908974 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885\": container with ID starting with 173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885 not found: ID does not exist" containerID="173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.909016 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885"} err="failed to get container status \"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885\": rpc error: code = NotFound desc = could not find container \"173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885\": container with ID starting with 173d9613546ad6e955d3b37fe9e4f67e5b002df88700f3b9adbbe182ac4ac885 not found: ID does not exist" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.909066 4826 scope.go:117] "RemoveContainer" containerID="b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.909321 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e\": container with ID starting with b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e not found: ID does not exist" containerID="b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.909364 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e"} err="failed to get container status \"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e\": rpc error: code = NotFound desc = could not find container \"b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e\": container with ID starting with b98ed83fad09b0ae1304a85c937ab659ddd95c941aaffe2313de3d75ea41161e not found: ID does not exist" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.909386 4826 scope.go:117] "RemoveContainer" containerID="2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9" Mar 19 19:22:05 crc kubenswrapper[4826]: E0319 19:22:05.910117 4826 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9\": container with ID starting with 2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9 not found: ID does not exist" containerID="2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.910150 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9"} err="failed to get container status \"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9\": rpc error: code = NotFound desc = could not find container \"2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9\": container with ID starting with 2983ef44b125d695c38d8c53c1cf5447b2305d4b4a37af84d506ccb8a394e0f9 not found: ID does not exist" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.946968 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-scripts\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947040 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlqc\" (UniqueName: \"kubernetes.io/projected/ab298593-ac97-4031-8bfc-b0e5be9b341a-kube-api-access-mvlqc\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947081 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " 
pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947119 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-run-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947170 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-log-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947241 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-config-data\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947270 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.947288 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.950454 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-run-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.951351 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab298593-ac97-4031-8bfc-b0e5be9b341a-log-httpd\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.952699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.960629 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-config-data\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.963160 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.963304 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.971049 4826 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab298593-ac97-4031-8bfc-b0e5be9b341a-scripts\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.972519 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlqc\" (UniqueName: \"kubernetes.io/projected/ab298593-ac97-4031-8bfc-b0e5be9b341a-kube-api-access-mvlqc\") pod \"ceilometer-0\" (UID: \"ab298593-ac97-4031-8bfc-b0e5be9b341a\") " pod="openstack/ceilometer-0" Mar 19 19:22:05 crc kubenswrapper[4826]: I0319 19:22:05.998408 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c065c18-3cb1-46f6-9d8d-b8acd51eb933" path="/var/lib/kubelet/pods/7c065c18-3cb1-46f6-9d8d-b8acd51eb933/volumes" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.132834 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.343832 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.460960 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities\") pod \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.461089 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mq7q\" (UniqueName: \"kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q\") pod \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.461254 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content\") pod \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\" (UID: \"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440\") " Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.462415 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities" (OuterVolumeSpecName: "utilities") pod "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" (UID: "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.466982 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q" (OuterVolumeSpecName: "kube-api-access-8mq7q") pod "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" (UID: "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440"). InnerVolumeSpecName "kube-api-access-8mq7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.488152 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" (UID: "bd1bf3ae-da1b-48c4-a1fb-b185d9af0440"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.563843 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.563880 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mq7q\" (UniqueName: \"kubernetes.io/projected/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-kube-api-access-8mq7q\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.563892 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.712762 4826 generic.go:334] "Generic (PLEG): container finished" podID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerID="9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af" exitCode=0 Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.712846 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerDied","Data":"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af"} Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.712917 4826 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt6c6" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.713092 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt6c6" event={"ID":"bd1bf3ae-da1b-48c4-a1fb-b185d9af0440","Type":"ContainerDied","Data":"b7da84f67e5138897bd314477ed242fc90021a93ca6f897ffb359f132238d7a6"} Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.713133 4826 scope.go:117] "RemoveContainer" containerID="9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.725359 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.756453 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" containerID="cri-o://e599ec2e8b0d98e5639b385d8980140372266b12b05d4bf6c5c497a47fd71073" gracePeriod=604796 Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.810914 4826 scope.go:117] "RemoveContainer" containerID="c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.830172 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.843457 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt6c6"] Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.873825 4826 scope.go:117] "RemoveContainer" containerID="cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.916939 4826 scope.go:117] "RemoveContainer" containerID="9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af" Mar 19 19:22:06 crc 
kubenswrapper[4826]: E0319 19:22:06.918299 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af\": container with ID starting with 9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af not found: ID does not exist" containerID="9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.918355 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af"} err="failed to get container status \"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af\": rpc error: code = NotFound desc = could not find container \"9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af\": container with ID starting with 9573f814854db176be441430dc75696361b4dd70f44144ecee45928e29b620af not found: ID does not exist" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.918390 4826 scope.go:117] "RemoveContainer" containerID="c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79" Mar 19 19:22:06 crc kubenswrapper[4826]: E0319 19:22:06.928534 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79\": container with ID starting with c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79 not found: ID does not exist" containerID="c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.928563 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79"} err="failed to get container status 
\"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79\": rpc error: code = NotFound desc = could not find container \"c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79\": container with ID starting with c8c7e99525722c823cd41b44e6fc4aab3fd1ced1f3bfed12c3e2b2a6ee99dc79 not found: ID does not exist" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.928589 4826 scope.go:117] "RemoveContainer" containerID="cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d" Mar 19 19:22:06 crc kubenswrapper[4826]: E0319 19:22:06.929643 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d\": container with ID starting with cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d not found: ID does not exist" containerID="cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d" Mar 19 19:22:06 crc kubenswrapper[4826]: I0319 19:22:06.929678 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d"} err="failed to get container status \"cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d\": rpc error: code = NotFound desc = could not find container \"cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d\": container with ID starting with cd82e12c8562217e656b3e3cf61edc2dd8af3766ce910aef2540a71d3b29bd8d not found: ID does not exist" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.070799 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.181300 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9dm\" (UniqueName: \"kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm\") pod \"f8d51938-c21d-4578-afe3-ecbbf8d67bd2\" (UID: \"f8d51938-c21d-4578-afe3-ecbbf8d67bd2\") " Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.188021 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm" (OuterVolumeSpecName: "kube-api-access-fl9dm") pod "f8d51938-c21d-4578-afe3-ecbbf8d67bd2" (UID: "f8d51938-c21d-4578-afe3-ecbbf8d67bd2"). InnerVolumeSpecName "kube-api-access-fl9dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.284473 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9dm\" (UniqueName: \"kubernetes.io/projected/f8d51938-c21d-4578-afe3-ecbbf8d67bd2-kube-api-access-fl9dm\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.543908 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" containerID="cri-o://0905438bb03dc381ef571e8ce2b64bc797077a674ddcc377521605ee89289434" gracePeriod=604796 Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.726605 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"5d9c7ce6e64fc92e4d94a7f8d951d16c5fe17d00df6bdf5701ccdbe285d62c3a"} Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.729671 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565802-wsq4t" event={"ID":"f8d51938-c21d-4578-afe3-ecbbf8d67bd2","Type":"ContainerDied","Data":"7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba"} Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.729701 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fa5934b782f6c5910f2ce829515c1fbe29c0ae1374730fdcb5e8b8ab032faba" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.729709 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565802-wsq4t" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.810169 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-wqw9j"] Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.822671 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565796-wqw9j"] Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.993793 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5fbfcc-053e-4453-961c-91a0719cdaa6" path="/var/lib/kubelet/pods/4d5fbfcc-053e-4453-961c-91a0719cdaa6/volumes" Mar 19 19:22:07 crc kubenswrapper[4826]: I0319 19:22:07.996488 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" path="/var/lib/kubelet/pods/bd1bf3ae-da1b-48c4-a1fb-b185d9af0440/volumes" Mar 19 19:22:09 crc kubenswrapper[4826]: I0319 19:22:09.843590 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 19 19:22:10 crc kubenswrapper[4826]: I0319 19:22:10.573016 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 19 19:22:10 crc kubenswrapper[4826]: I0319 19:22:10.976919 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:22:10 crc kubenswrapper[4826]: E0319 19:22:10.977438 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:22:13 crc kubenswrapper[4826]: I0319 19:22:13.857219 4826 generic.go:334] "Generic (PLEG): container finished" podID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerID="0905438bb03dc381ef571e8ce2b64bc797077a674ddcc377521605ee89289434" exitCode=0 Mar 19 19:22:13 crc kubenswrapper[4826]: I0319 19:22:13.857302 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerDied","Data":"0905438bb03dc381ef571e8ce2b64bc797077a674ddcc377521605ee89289434"} Mar 19 19:22:13 crc kubenswrapper[4826]: I0319 19:22:13.879803 4826 generic.go:334] "Generic (PLEG): container finished" podID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerID="e599ec2e8b0d98e5639b385d8980140372266b12b05d4bf6c5c497a47fd71073" exitCode=0 Mar 19 19:22:13 crc kubenswrapper[4826]: I0319 19:22:13.879838 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerDied","Data":"e599ec2e8b0d98e5639b385d8980140372266b12b05d4bf6c5c497a47fd71073"} Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.951834 4826 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"] Mar 19 19:22:15 crc kubenswrapper[4826]: E0319 19:22:15.953783 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="registry-server" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.953805 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="registry-server" Mar 19 19:22:15 crc kubenswrapper[4826]: E0319 19:22:15.953893 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d51938-c21d-4578-afe3-ecbbf8d67bd2" containerName="oc" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.953903 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d51938-c21d-4578-afe3-ecbbf8d67bd2" containerName="oc" Mar 19 19:22:15 crc kubenswrapper[4826]: E0319 19:22:15.953931 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="extract-content" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.953939 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="extract-content" Mar 19 19:22:15 crc kubenswrapper[4826]: E0319 19:22:15.953973 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="extract-utilities" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.953983 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="extract-utilities" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.954612 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1bf3ae-da1b-48c4-a1fb-b185d9af0440" containerName="registry-server" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.957150 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f8d51938-c21d-4578-afe3-ecbbf8d67bd2" containerName="oc" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.962261 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:15 crc kubenswrapper[4826]: I0319 19:22:15.965778 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.027827 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"] Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049286 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwx8\" (UniqueName: \"kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049391 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049429 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049534 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049595 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049616 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.049684 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.151731 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.151864 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.151886 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.151945 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.152006 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwx8\" (UniqueName: \"kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.152057 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.152089 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.153478 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.154449 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.154513 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.154739 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.154768 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.156287 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.170760 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwx8\" (UniqueName: \"kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8\") pod \"dnsmasq-dns-5b75489c6f-gqz8n\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:16 crc kubenswrapper[4826]: I0319 19:22:16.314932 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:19 crc kubenswrapper[4826]: I0319 19:22:19.843739 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 19 19:22:21 crc kubenswrapper[4826]: I0319 19:22:21.976981 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:22:21 crc kubenswrapper[4826]: E0319 19:22:21.977953 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.060596 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.071384 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.075582 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"69dc8d23-ac18-40b1-99d9-365705c5753b","Type":"ContainerDied","Data":"40d6e310243716b9d6f4a2f5859aab977e50d0f43d3b6857cbbeb1778e8122dc"} Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.075608 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.075642 4826 scope.go:117] "RemoveContainer" containerID="0905438bb03dc381ef571e8ce2b64bc797077a674ddcc377521605ee89289434" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.078908 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ad041e2d-3400-49ce-b25f-0d335f3b6738","Type":"ContainerDied","Data":"12d0d4d94335881c332d8a29d3a7295c94f1cd8e18970716075bf0c2905733c6"} Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.078981 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.123534 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.128766 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.136686 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.157853 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.157926 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158000 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158030 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158102 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158150 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158211 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158248 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158272 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4c5s\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158303 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158361 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 
19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158380 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158416 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158490 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158544 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xxg5\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158565 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158588 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158622 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158644 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret\") pod \"ad041e2d-3400-49ce-b25f-0d335f3b6738\" (UID: \"ad041e2d-3400-49ce-b25f-0d335f3b6738\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.158710 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd\") pod \"69dc8d23-ac18-40b1-99d9-365705c5753b\" (UID: \"69dc8d23-ac18-40b1-99d9-365705c5753b\") " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.159534 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.162502 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.164555 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.166817 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.167846 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.168241 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.172832 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.178692 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.180372 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.184983 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s" (OuterVolumeSpecName: "kube-api-access-r4c5s") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "kube-api-access-r4c5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.189629 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.189647 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info" (OuterVolumeSpecName: "pod-info") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.189781 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5" (OuterVolumeSpecName: "kube-api-access-6xxg5") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "kube-api-access-6xxg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.190256 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a" (OuterVolumeSpecName: "persistence") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.190545 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info" (OuterVolumeSpecName: "pod-info") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.241571 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e" (OuterVolumeSpecName: "persistence") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270285 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270351 4826 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270372 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4c5s\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-kube-api-access-r4c5s\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270394 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270411 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270429 4826 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/69dc8d23-ac18-40b1-99d9-365705c5753b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270443 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xxg5\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-kube-api-access-6xxg5\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270453 4826 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad041e2d-3400-49ce-b25f-0d335f3b6738-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270462 4826 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270470 4826 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad041e2d-3400-49ce-b25f-0d335f3b6738-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270503 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") on node \"crc\" " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270517 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") on node \"crc\" " Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270529 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270541 4826 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/69dc8d23-ac18-40b1-99d9-365705c5753b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.270551 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.287853 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data" (OuterVolumeSpecName: "config-data") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.288961 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data" (OuterVolumeSpecName: "config-data") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.331229 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf" (OuterVolumeSpecName: "server-conf") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.345998 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf" (OuterVolumeSpecName: "server-conf") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.372357 4826 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.372574 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.372686 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69dc8d23-ac18-40b1-99d9-365705c5753b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.372759 4826 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad041e2d-3400-49ce-b25f-0d335f3b6738-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.381893 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "69dc8d23-ac18-40b1-99d9-365705c5753b" (UID: "69dc8d23-ac18-40b1-99d9-365705c5753b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.406183 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.406186 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.406936 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e") on node "crc" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.406945 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ad041e2d-3400-49ce-b25f-0d335f3b6738" (UID: "ad041e2d-3400-49ce-b25f-0d335f3b6738"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.406962 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a") on node "crc" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.482543 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad041e2d-3400-49ce-b25f-0d335f3b6738-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.482836 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/69dc8d23-ac18-40b1-99d9-365705c5753b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.482924 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.483015 4826 reconciler_common.go:293] "Volume detached for 
volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.573207 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: i/o timeout" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.727031 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.741832 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.755825 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.778931 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.800948 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:25 crc kubenswrapper[4826]: E0319 19:22:25.801834 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="setup-container" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.801926 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="setup-container" Mar 19 19:22:25 crc kubenswrapper[4826]: E0319 19:22:25.801996 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="setup-container" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.802054 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="setup-container" Mar 19 19:22:25 crc kubenswrapper[4826]: E0319 19:22:25.802138 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.802203 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: E0319 19:22:25.802279 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.802351 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.802624 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.802721 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" containerName="rabbitmq" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.804191 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.806341 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-x4lnc" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.807223 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.807374 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.807535 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.807738 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.809051 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.814352 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.815904 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.822218 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.850485 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.870497 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.900834 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.900900 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.900937 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5e1809-5292-4c32-a83e-9dbb01f1db4b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.900965 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901008 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901051 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901146 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901202 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/208228fc-8848-4817-96ea-48e37f6386ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901282 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 
19:22:25.901317 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjwbl\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-kube-api-access-wjwbl\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901363 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901396 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901424 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901452 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/208228fc-8848-4817-96ea-48e37f6386ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901563 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901599 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr26z\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-kube-api-access-cr26z\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901649 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.901761 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.902367 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 
19:22:25.902420 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.902464 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5e1809-5292-4c32-a83e-9dbb01f1db4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:25 crc kubenswrapper[4826]: I0319 19:22:25.902511 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.004932 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.004989 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 
19:22:26.005020 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005077 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005111 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/208228fc-8848-4817-96ea-48e37f6386ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005153 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005173 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjwbl\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-kube-api-access-wjwbl\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005191 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005212 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005228 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005241 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/208228fc-8848-4817-96ea-48e37f6386ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005302 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005321 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr26z\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-kube-api-access-cr26z\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005353 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005369 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005509 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005526 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005545 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5e1809-5292-4c32-a83e-9dbb01f1db4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") 
" pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005564 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005631 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005647 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.005679 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5e1809-5292-4c32-a83e-9dbb01f1db4b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.011757 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.012193 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.013459 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.013815 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.014344 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.017610 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.017982 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/de5e1809-5292-4c32-a83e-9dbb01f1db4b-pod-info\") pod \"rabbitmq-server-2\" 
(UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.018099 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.019374 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-config-data\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.020389 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/208228fc-8848-4817-96ea-48e37f6386ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.020594 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/208228fc-8848-4817-96ea-48e37f6386ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.020717 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.020891 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/208228fc-8848-4817-96ea-48e37f6386ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.021231 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/de5e1809-5292-4c32-a83e-9dbb01f1db4b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.022647 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.022715 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a75865b4496e6c0b397e1e83bc349881282d7666fda34b40e214046b93469f8a/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.022941 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.022973 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c227a1c94246a3d17c388fbd1d144f5240f0189ff653b85b39092d420b9acf6/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.024881 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.024965 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.034142 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr26z\" (UniqueName: \"kubernetes.io/projected/208228fc-8848-4817-96ea-48e37f6386ce-kube-api-access-cr26z\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.036114 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69dc8d23-ac18-40b1-99d9-365705c5753b" path="/var/lib/kubelet/pods/69dc8d23-ac18-40b1-99d9-365705c5753b/volumes" Mar 19 19:22:26 crc kubenswrapper[4826]: 
I0319 19:22:26.038541 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.040377 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/de5e1809-5292-4c32-a83e-9dbb01f1db4b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.041340 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjwbl\" (UniqueName: \"kubernetes.io/projected/de5e1809-5292-4c32-a83e-9dbb01f1db4b-kube-api-access-wjwbl\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.048250 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad041e2d-3400-49ce-b25f-0d335f3b6738" path="/var/lib/kubelet/pods/ad041e2d-3400-49ce-b25f-0d335f3b6738/volumes" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.116552 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5f383150-575a-4ec4-8521-2f187b5ecf9e\") pod \"rabbitmq-cell1-server-0\" (UID: \"208228fc-8848-4817-96ea-48e37f6386ce\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.133275 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.206675 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3b6961b-20a9-4b00-9638-0d75e0bb359a\") pod \"rabbitmq-server-2\" (UID: \"de5e1809-5292-4c32-a83e-9dbb01f1db4b\") " pod="openstack/rabbitmq-server-2" Mar 19 19:22:26 crc kubenswrapper[4826]: E0319 19:22:26.232116 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 19:22:26 crc kubenswrapper[4826]: E0319 19:22:26.232200 4826 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 19 19:22:26 crc kubenswrapper[4826]: E0319 19:22:26.232369 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68bh56bh57bh56ch5bfh594h566hd7h57fh5b4h58fh5ffhf7h59h65fh77h5cfh66ch66bh57fh64fh5ch565hc8h5fdh67dh598h68ch579h65fh8bhf5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvlqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ab298593-ac97-4031-8bfc-b0e5be9b341a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 19:22:26 crc kubenswrapper[4826]: I0319 19:22:26.450898 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 19:22:27 crc kubenswrapper[4826]: I0319 19:22:27.419113 4826 scope.go:117] "RemoveContainer" containerID="d503c343673c800d54ff6e6cc56a18acaec57bf393c9bb2a22d379eea6512b2d" Mar 19 19:22:27 crc kubenswrapper[4826]: E0319 19:22:27.443833 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 19 19:22:27 crc kubenswrapper[4826]: E0319 19:22:27.443877 4826 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 19 19:22:27 crc kubenswrapper[4826]: E0319 19:22:27.444153 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nr8jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-pvzvx_openstack(be3204f1-b777-49ae-8ba5-2f30f639dd1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
19 19:22:27 crc kubenswrapper[4826]: E0319 19:22:27.445260 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-pvzvx" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" Mar 19 19:22:27 crc kubenswrapper[4826]: I0319 19:22:27.569552 4826 scope.go:117] "RemoveContainer" containerID="e599ec2e8b0d98e5639b385d8980140372266b12b05d4bf6c5c497a47fd71073" Mar 19 19:22:27 crc kubenswrapper[4826]: I0319 19:22:27.733282 4826 scope.go:117] "RemoveContainer" containerID="4949977f15e132445d8d9e1657957d62bc88426ae01376ce6c0dd6719b94f8e3" Mar 19 19:22:28 crc kubenswrapper[4826]: I0319 19:22:28.140354 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"] Mar 19 19:22:28 crc kubenswrapper[4826]: W0319 19:22:28.149914 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1734c8_f29e_4752_8f25_9f6a79025585.slice/crio-cbb9b15063cf3bc5bb6fa1ca3d37cbee20b9841c1f7401bf4e14035b37378d2b WatchSource:0}: Error finding container cbb9b15063cf3bc5bb6fa1ca3d37cbee20b9841c1f7401bf4e14035b37378d2b: Status 404 returned error can't find the container with id cbb9b15063cf3bc5bb6fa1ca3d37cbee20b9841c1f7401bf4e14035b37378d2b Mar 19 19:22:28 crc kubenswrapper[4826]: I0319 19:22:28.154188 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"dacc496cd5f34d0a818ca8fe2096b259f814162c54acdfb0cf59ac63feb21a71"} Mar 19 19:22:28 crc kubenswrapper[4826]: E0319 19:22:28.157523 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-pvzvx" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" Mar 19 19:22:28 crc kubenswrapper[4826]: I0319 19:22:28.243046 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 19:22:28 crc kubenswrapper[4826]: W0319 19:22:28.248691 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208228fc_8848_4817_96ea_48e37f6386ce.slice/crio-595375bcb7a70b90d74c0d3224f5b6131388e541967ee390b3572d84cb5d37ae WatchSource:0}: Error finding container 595375bcb7a70b90d74c0d3224f5b6131388e541967ee390b3572d84cb5d37ae: Status 404 returned error can't find the container with id 595375bcb7a70b90d74c0d3224f5b6131388e541967ee390b3572d84cb5d37ae Mar 19 19:22:28 crc kubenswrapper[4826]: I0319 19:22:28.260125 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.172408 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"208228fc-8848-4817-96ea-48e37f6386ce","Type":"ContainerStarted","Data":"595375bcb7a70b90d74c0d3224f5b6131388e541967ee390b3572d84cb5d37ae"} Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.174456 4826 generic.go:334] "Generic (PLEG): container finished" podID="af1734c8-f29e-4752-8f25-9f6a79025585" containerID="5ae5b159495549edaff33f3363ab91ead16c5b2da3d477c4886b32dae2a28f10" exitCode=0 Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.174548 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" event={"ID":"af1734c8-f29e-4752-8f25-9f6a79025585","Type":"ContainerDied","Data":"5ae5b159495549edaff33f3363ab91ead16c5b2da3d477c4886b32dae2a28f10"} Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.174589 4826 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" event={"ID":"af1734c8-f29e-4752-8f25-9f6a79025585","Type":"ContainerStarted","Data":"cbb9b15063cf3bc5bb6fa1ca3d37cbee20b9841c1f7401bf4e14035b37378d2b"} Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.181022 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"8110d75b80fb4ede399b8d93f09f724390828cf9e4fab780892a44765987383f"} Mar 19 19:22:29 crc kubenswrapper[4826]: I0319 19:22:29.185439 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"de5e1809-5292-4c32-a83e-9dbb01f1db4b","Type":"ContainerStarted","Data":"95aa38fac29cdd47cbec9aef078c6618ace7f1b65c304570b6767257e8453ac3"} Mar 19 19:22:30 crc kubenswrapper[4826]: I0319 19:22:30.220966 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" event={"ID":"af1734c8-f29e-4752-8f25-9f6a79025585","Type":"ContainerStarted","Data":"297d9e67c38b84abe12517591a2fdf70b5de6e47f9cd2a7015f7f7be2b570a69"} Mar 19 19:22:30 crc kubenswrapper[4826]: I0319 19:22:30.221384 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:30 crc kubenswrapper[4826]: I0319 19:22:30.226916 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"de5e1809-5292-4c32-a83e-9dbb01f1db4b","Type":"ContainerStarted","Data":"85cd82324840327d65a54cf0e628c0891c26bad8f87b2bdaf19c767754370f8e"} Mar 19 19:22:30 crc kubenswrapper[4826]: I0319 19:22:30.257072 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" podStartSLOduration=15.257042133 podStartE2EDuration="15.257042133s" podCreationTimestamp="2026-03-19 19:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:22:30.244873558 +0000 UTC m=+1574.998941911" watchObservedRunningTime="2026-03-19 19:22:30.257042133 +0000 UTC m=+1575.011110486" Mar 19 19:22:30 crc kubenswrapper[4826]: E0319 19:22:30.835765 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" Mar 19 19:22:31 crc kubenswrapper[4826]: I0319 19:22:31.239987 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"208228fc-8848-4817-96ea-48e37f6386ce","Type":"ContainerStarted","Data":"b2162d04f73e1d7502688b6dbf701b8b79bcaa056cf6a8c51449be4bb7bd7717"} Mar 19 19:22:31 crc kubenswrapper[4826]: I0319 19:22:31.243024 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"3d79f3eb0762b6d097d5adc529528de98868de647ea25aa50c65126b234a2e03"} Mar 19 19:22:31 crc kubenswrapper[4826]: E0319 19:22:31.245318 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" Mar 19 19:22:32 crc kubenswrapper[4826]: I0319 19:22:32.258642 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 19:22:32 crc kubenswrapper[4826]: E0319 19:22:32.261522 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" Mar 19 19:22:32 crc kubenswrapper[4826]: I0319 19:22:32.978473 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:22:32 crc kubenswrapper[4826]: E0319 19:22:32.979067 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:22:33 crc kubenswrapper[4826]: E0319 19:22:33.272937 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.317528 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.410437 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"] Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.410740 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="dnsmasq-dns" containerID="cri-o://420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5" gracePeriod=10 Mar 19 19:22:36 crc kubenswrapper[4826]: 
I0319 19:22:36.695312 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-qk9tk"] Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.697693 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.708892 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-qk9tk"] Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718133 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718173 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718231 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-config\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718250 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718274 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssnh\" (UniqueName: \"kubernetes.io/projected/ce7b509e-beae-4a19-803c-339c76a4d51e-kube-api-access-kssnh\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718322 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.718373 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819134 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-config\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819168 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819201 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssnh\" (UniqueName: \"kubernetes.io/projected/ce7b509e-beae-4a19-803c-339c76a4d51e-kube-api-access-kssnh\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819256 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819308 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819397 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.819420 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: 
\"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.821695 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.821946 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-config\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.822509 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.825675 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.827734 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" Mar 19 19:22:36 
crc kubenswrapper[4826]: I0319 19:22:36.827924 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7b509e-beae-4a19-803c-339c76a4d51e-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk"
Mar 19 19:22:36 crc kubenswrapper[4826]: I0319 19:22:36.848032 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssnh\" (UniqueName: \"kubernetes.io/projected/ce7b509e-beae-4a19-803c-339c76a4d51e-kube-api-access-kssnh\") pod \"dnsmasq-dns-5d75f767dc-qk9tk\" (UID: \"ce7b509e-beae-4a19-803c-339c76a4d51e\") " pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.061498 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.192719 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.230207 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.230616 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.230735 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.230808 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.230939 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.231034 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb\") pod \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\" (UID: \"198aca80-7461-4f41-b6ff-9fd31d3d28e2\") "
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.258245 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx" (OuterVolumeSpecName: "kube-api-access-kf4mx") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "kube-api-access-kf4mx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.341961 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/198aca80-7461-4f41-b6ff-9fd31d3d28e2-kube-api-access-kf4mx\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.344771 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config" (OuterVolumeSpecName: "config") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.355646 4826 generic.go:334] "Generic (PLEG): container finished" podID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerID="420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5" exitCode=0
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.355829 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" event={"ID":"198aca80-7461-4f41-b6ff-9fd31d3d28e2","Type":"ContainerDied","Data":"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"}
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.355924 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql" event={"ID":"198aca80-7461-4f41-b6ff-9fd31d3d28e2","Type":"ContainerDied","Data":"c0d5fe54ce8ab26b51dddc5552e4c9cae64c7fd049abcc0692a4539d07ad818a"}
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.355975 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-p6gql"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.356001 4826 scope.go:117] "RemoveContainer" containerID="420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.407798 4826 scope.go:117] "RemoveContainer" containerID="8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.417788 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.426141 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.434832 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.443481 4826 scope.go:117] "RemoveContainer" containerID="420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.443903 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.443933 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-config\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.443943 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.443954 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: E0319 19:22:37.444858 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5\": container with ID starting with 420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5 not found: ID does not exist" containerID="420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.444894 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "198aca80-7461-4f41-b6ff-9fd31d3d28e2" (UID: "198aca80-7461-4f41-b6ff-9fd31d3d28e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.444901 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5"} err="failed to get container status \"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5\": rpc error: code = NotFound desc = could not find container \"420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5\": container with ID starting with 420f2b8bdc247dc7b891bff22203648c58a40a66299159e214fe030add8758f5 not found: ID does not exist"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.444948 4826 scope.go:117] "RemoveContainer" containerID="8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3"
Mar 19 19:22:37 crc kubenswrapper[4826]: E0319 19:22:37.445254 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3\": container with ID starting with 8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3 not found: ID does not exist" containerID="8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.445285 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3"} err="failed to get container status \"8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3\": rpc error: code = NotFound desc = could not find container \"8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3\": container with ID starting with 8fab521938bb68ddea22381c171439950485d7ad21844c9b7b3ade5ced33a7d3 not found: ID does not exist"
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.548389 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/198aca80-7461-4f41-b6ff-9fd31d3d28e2-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.603923 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-qk9tk"]
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.813461 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"]
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.827885 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-p6gql"]
Mar 19 19:22:37 crc kubenswrapper[4826]: I0319 19:22:37.989920 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" path="/var/lib/kubelet/pods/198aca80-7461-4f41-b6ff-9fd31d3d28e2/volumes"
Mar 19 19:22:38 crc kubenswrapper[4826]: I0319 19:22:38.374423 4826 generic.go:334] "Generic (PLEG): container finished" podID="ce7b509e-beae-4a19-803c-339c76a4d51e" containerID="6f2feb6fcf621cc7d4c73211ed4e496a9d23f37b7263e4230115dc422235dde0" exitCode=0
Mar 19 19:22:38 crc kubenswrapper[4826]: I0319 19:22:38.374487 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" event={"ID":"ce7b509e-beae-4a19-803c-339c76a4d51e","Type":"ContainerDied","Data":"6f2feb6fcf621cc7d4c73211ed4e496a9d23f37b7263e4230115dc422235dde0"}
Mar 19 19:22:38 crc kubenswrapper[4826]: I0319 19:22:38.374908 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" event={"ID":"ce7b509e-beae-4a19-803c-339c76a4d51e","Type":"ContainerStarted","Data":"95d97a9e6d930c1a257f6edc21acbff00ca11e884c34698f04290249ca2ae648"}
Mar 19 19:22:39 crc kubenswrapper[4826]: I0319 19:22:39.392158 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" event={"ID":"ce7b509e-beae-4a19-803c-339c76a4d51e","Type":"ContainerStarted","Data":"408f01833d25ef7dc6397988c518073b96955bd610d13c55bc6f4692813d0d41"}
Mar 19 19:22:39 crc kubenswrapper[4826]: I0319 19:22:39.393750 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk"
Mar 19 19:22:39 crc kubenswrapper[4826]: I0319 19:22:39.415523 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk" podStartSLOduration=3.415508907 podStartE2EDuration="3.415508907s" podCreationTimestamp="2026-03-19 19:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:22:39.413433336 +0000 UTC m=+1584.167501689" watchObservedRunningTime="2026-03-19 19:22:39.415508907 +0000 UTC m=+1584.169577220"
Mar 19 19:22:41 crc kubenswrapper[4826]: I0319 19:22:41.456198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pvzvx" event={"ID":"be3204f1-b777-49ae-8ba5-2f30f639dd1e","Type":"ContainerStarted","Data":"85d96ce366d983e2fc3b550894d052f631e25ebfeae724bed7516dfd925e0ee2"}
Mar 19 19:22:41 crc kubenswrapper[4826]: I0319 19:22:41.499266 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-pvzvx" podStartSLOduration=2.717655136 podStartE2EDuration="42.499250019s" podCreationTimestamp="2026-03-19 19:21:59 +0000 UTC" firstStartedPulling="2026-03-19 19:22:00.406031575 +0000 UTC m=+1545.160099888" lastFinishedPulling="2026-03-19 19:22:40.187626418 +0000 UTC m=+1584.941694771" observedRunningTime="2026-03-19 19:22:41.49226213 +0000 UTC m=+1586.246330453" watchObservedRunningTime="2026-03-19 19:22:41.499250019 +0000 UTC m=+1586.253318332"
Mar 19 19:22:43 crc kubenswrapper[4826]: I0319 19:22:43.477324 4826 generic.go:334] "Generic (PLEG): container finished" podID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" containerID="85d96ce366d983e2fc3b550894d052f631e25ebfeae724bed7516dfd925e0ee2" exitCode=0
Mar 19 19:22:43 crc kubenswrapper[4826]: I0319 19:22:43.477361 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pvzvx" event={"ID":"be3204f1-b777-49ae-8ba5-2f30f639dd1e","Type":"ContainerDied","Data":"85d96ce366d983e2fc3b550894d052f631e25ebfeae724bed7516dfd925e0ee2"}
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.047522 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pvzvx"
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.051951 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr8jz\" (UniqueName: \"kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz\") pod \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") "
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.052143 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle\") pod \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") "
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.052340 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data\") pod \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\" (UID: \"be3204f1-b777-49ae-8ba5-2f30f639dd1e\") "
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.058018 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz" (OuterVolumeSpecName: "kube-api-access-nr8jz") pod "be3204f1-b777-49ae-8ba5-2f30f639dd1e" (UID: "be3204f1-b777-49ae-8ba5-2f30f639dd1e"). InnerVolumeSpecName "kube-api-access-nr8jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.095501 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be3204f1-b777-49ae-8ba5-2f30f639dd1e" (UID: "be3204f1-b777-49ae-8ba5-2f30f639dd1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.155490 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr8jz\" (UniqueName: \"kubernetes.io/projected/be3204f1-b777-49ae-8ba5-2f30f639dd1e-kube-api-access-nr8jz\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.155533 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.176336 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data" (OuterVolumeSpecName: "config-data") pod "be3204f1-b777-49ae-8ba5-2f30f639dd1e" (UID: "be3204f1-b777-49ae-8ba5-2f30f639dd1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.258624 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3204f1-b777-49ae-8ba5-2f30f639dd1e-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.507228 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-pvzvx" event={"ID":"be3204f1-b777-49ae-8ba5-2f30f639dd1e","Type":"ContainerDied","Data":"2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e"}
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.507277 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8d9bece542e6e111d1cd175f7388306671578d27f4e7d2a6941db673d89e5e"
Mar 19 19:22:45 crc kubenswrapper[4826]: I0319 19:22:45.507310 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-pvzvx"
Mar 19 19:22:46 crc kubenswrapper[4826]: I0319 19:22:46.001513 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"
Mar 19 19:22:46 crc kubenswrapper[4826]: E0319 19:22:46.002214 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:22:46 crc kubenswrapper[4826]: I0319 19:22:46.997332 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.062943 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-qk9tk"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.154039 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.154310 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="dnsmasq-dns" containerID="cri-o://297d9e67c38b84abe12517591a2fdf70b5de6e47f9cd2a7015f7f7be2b570a69" gracePeriod=10
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.176788 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-84bf69fdcb-b6hq4"]
Mar 19 19:22:47 crc kubenswrapper[4826]: E0319 19:22:47.177356 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" containerName="heat-db-sync"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.177373 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" containerName="heat-db-sync"
Mar 19 19:22:47 crc kubenswrapper[4826]: E0319 19:22:47.177396 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="dnsmasq-dns"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.177404 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="dnsmasq-dns"
Mar 19 19:22:47 crc kubenswrapper[4826]: E0319 19:22:47.177417 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="init"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.177424 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="init"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.177715 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="198aca80-7461-4f41-b6ff-9fd31d3d28e2" containerName="dnsmasq-dns"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.177735 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" containerName="heat-db-sync"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.178789 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.198458 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-84bf69fdcb-b6hq4"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.245488 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5b5447b648-5hq9h"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.250832 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.252805 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-combined-ca-bundle\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.252930 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data-custom\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.253031 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5lk\" (UniqueName: \"kubernetes.io/projected/a8d87bc1-29fa-4219-8c55-968d58f697e8-kube-api-access-pg5lk\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.264691 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.295714 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b5447b648-5hq9h"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.317533 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-fb594b459-7sf97"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.319206 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.340885 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-fb594b459-7sf97"]
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355375 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n64dx\" (UniqueName: \"kubernetes.io/projected/77ff195c-0819-4764-b09f-fd10a1aea177-kube-api-access-n64dx\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355426 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355463 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-public-tls-certs\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355486 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-internal-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355549 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355572 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvr8b\" (UniqueName: \"kubernetes.io/projected/45e3dc79-4f5e-4bec-a579-41b93f1d6150-kube-api-access-fvr8b\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355597 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355614 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data-custom\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355631 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-combined-ca-bundle\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355648 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-public-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355735 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data-custom\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355760 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data-custom\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355810 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-combined-ca-bundle\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355829 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-internal-tls-certs\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355850 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5lk\" (UniqueName: \"kubernetes.io/projected/a8d87bc1-29fa-4219-8c55-968d58f697e8-kube-api-access-pg5lk\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.355872 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-combined-ca-bundle\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.362386 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-combined-ca-bundle\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.369534 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.370236 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8d87bc1-29fa-4219-8c55-968d58f697e8-config-data-custom\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.382244 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5lk\" (UniqueName: \"kubernetes.io/projected/a8d87bc1-29fa-4219-8c55-968d58f697e8-kube-api-access-pg5lk\") pod \"heat-engine-84bf69fdcb-b6hq4\" (UID: \"a8d87bc1-29fa-4219-8c55-968d58f697e8\") " pod="openstack/heat-engine-84bf69fdcb-b6hq4"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458311 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvr8b\" (UniqueName: \"kubernetes.io/projected/45e3dc79-4f5e-4bec-a579-41b93f1d6150-kube-api-access-fvr8b\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458357 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458385 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data-custom\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458405 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-public-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458446 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data-custom\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458489 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-combined-ca-bundle\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458513 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-internal-tls-certs\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458555 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-combined-ca-bundle\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458617 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n64dx\" (UniqueName: \"kubernetes.io/projected/77ff195c-0819-4764-b09f-fd10a1aea177-kube-api-access-n64dx\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458645 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458689 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-public-tls-certs\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.458714 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-internal-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.465017 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-public-tls-certs\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.465334 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-combined-ca-bundle\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97"
Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.465767 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-internal-tls-certs\")
pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.467030 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data-custom\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.468283 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data-custom\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.468446 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-config-data\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.471886 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-internal-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.471950 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-config-data\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " 
pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.472604 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e3dc79-4f5e-4bec-a579-41b93f1d6150-combined-ca-bundle\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.475384 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n64dx\" (UniqueName: \"kubernetes.io/projected/77ff195c-0819-4764-b09f-fd10a1aea177-kube-api-access-n64dx\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.476601 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77ff195c-0819-4764-b09f-fd10a1aea177-public-tls-certs\") pod \"heat-api-fb594b459-7sf97\" (UID: \"77ff195c-0819-4764-b09f-fd10a1aea177\") " pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.486371 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-84bf69fdcb-b6hq4" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.489433 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvr8b\" (UniqueName: \"kubernetes.io/projected/45e3dc79-4f5e-4bec-a579-41b93f1d6150-kube-api-access-fvr8b\") pod \"heat-cfnapi-5b5447b648-5hq9h\" (UID: \"45e3dc79-4f5e-4bec-a579-41b93f1d6150\") " pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.498183 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.545353 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.559625 4826 generic.go:334] "Generic (PLEG): container finished" podID="af1734c8-f29e-4752-8f25-9f6a79025585" containerID="297d9e67c38b84abe12517591a2fdf70b5de6e47f9cd2a7015f7f7be2b570a69" exitCode=0 Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.559974 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" event={"ID":"af1734c8-f29e-4752-8f25-9f6a79025585","Type":"ContainerDied","Data":"297d9e67c38b84abe12517591a2fdf70b5de6e47f9cd2a7015f7f7be2b570a69"} Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.736087 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.886621 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887092 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887223 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam\") pod 
\"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887250 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887330 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887491 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lwx8\" (UniqueName: \"kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.887560 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config\") pod \"af1734c8-f29e-4752-8f25-9f6a79025585\" (UID: \"af1734c8-f29e-4752-8f25-9f6a79025585\") " Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.897415 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8" (OuterVolumeSpecName: "kube-api-access-8lwx8") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "kube-api-access-8lwx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.957449 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.978529 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config" (OuterVolumeSpecName: "config") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.985191 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.992511 4826 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.992541 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lwx8\" (UniqueName: \"kubernetes.io/projected/af1734c8-f29e-4752-8f25-9f6a79025585-kube-api-access-8lwx8\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.992551 4826 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-config\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.992561 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:47 crc kubenswrapper[4826]: I0319 19:22:47.993918 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.019171 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.027709 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af1734c8-f29e-4752-8f25-9f6a79025585" (UID: "af1734c8-f29e-4752-8f25-9f6a79025585"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:22:48 crc kubenswrapper[4826]: W0319 19:22:48.036420 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e3dc79_4f5e_4bec_a579_41b93f1d6150.slice/crio-7f12f88d3b9a9a6e131dff2f603d96cf1b25775dc336f623a62d3a6ebaa878f5 WatchSource:0}: Error finding container 7f12f88d3b9a9a6e131dff2f603d96cf1b25775dc336f623a62d3a6ebaa878f5: Status 404 returned error can't find the container with id 7f12f88d3b9a9a6e131dff2f603d96cf1b25775dc336f623a62d3a6ebaa878f5 Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.041294 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b5447b648-5hq9h"] Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.095865 4826 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.096160 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.096258 4826 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1734c8-f29e-4752-8f25-9f6a79025585-dns-svc\") on node \"crc\" DevicePath 
\"\"" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.145062 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-84bf69fdcb-b6hq4"] Mar 19 19:22:48 crc kubenswrapper[4826]: W0319 19:22:48.327018 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77ff195c_0819_4764_b09f_fd10a1aea177.slice/crio-b5c1be771ab7b0be5b67cd3d4ddfafcdeae16c1f2d3f7e2e138bd9820c8d867d WatchSource:0}: Error finding container b5c1be771ab7b0be5b67cd3d4ddfafcdeae16c1f2d3f7e2e138bd9820c8d867d: Status 404 returned error can't find the container with id b5c1be771ab7b0be5b67cd3d4ddfafcdeae16c1f2d3f7e2e138bd9820c8d867d Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.327249 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-fb594b459-7sf97"] Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.575316 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-84bf69fdcb-b6hq4" event={"ID":"a8d87bc1-29fa-4219-8c55-968d58f697e8","Type":"ContainerStarted","Data":"3a0f83cd948759a9afec7d6b3b85dbf79ffff91cfaff3757a393b77d42c0d972"} Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.578237 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.578530 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-gqz8n" event={"ID":"af1734c8-f29e-4752-8f25-9f6a79025585","Type":"ContainerDied","Data":"cbb9b15063cf3bc5bb6fa1ca3d37cbee20b9841c1f7401bf4e14035b37378d2b"} Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.578623 4826 scope.go:117] "RemoveContainer" containerID="297d9e67c38b84abe12517591a2fdf70b5de6e47f9cd2a7015f7f7be2b570a69" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.580318 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-fb594b459-7sf97" event={"ID":"77ff195c-0819-4764-b09f-fd10a1aea177","Type":"ContainerStarted","Data":"b5c1be771ab7b0be5b67cd3d4ddfafcdeae16c1f2d3f7e2e138bd9820c8d867d"} Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.585970 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"6bce5b1cd3e4e908191a9ef12dfd6f5c8e6ba3c2ec093ded1e9938b5a4c85dc8"} Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.588985 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" event={"ID":"45e3dc79-4f5e-4bec-a579-41b93f1d6150","Type":"ContainerStarted","Data":"7f12f88d3b9a9a6e131dff2f603d96cf1b25775dc336f623a62d3a6ebaa878f5"} Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.602405 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-84bf69fdcb-b6hq4" podStartSLOduration=1.602387737 podStartE2EDuration="1.602387737s" podCreationTimestamp="2026-03-19 19:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:22:48.592446277 +0000 UTC m=+1593.346514590" 
watchObservedRunningTime="2026-03-19 19:22:48.602387737 +0000 UTC m=+1593.356456050" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.649707 4826 scope.go:117] "RemoveContainer" containerID="5ae5b159495549edaff33f3363ab91ead16c5b2da3d477c4886b32dae2a28f10" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.651831 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.208918255 podStartE2EDuration="43.651809831s" podCreationTimestamp="2026-03-19 19:22:05 +0000 UTC" firstStartedPulling="2026-03-19 19:22:06.731560188 +0000 UTC m=+1551.485628501" lastFinishedPulling="2026-03-19 19:22:47.174451764 +0000 UTC m=+1591.928520077" observedRunningTime="2026-03-19 19:22:48.619285735 +0000 UTC m=+1593.373354048" watchObservedRunningTime="2026-03-19 19:22:48.651809831 +0000 UTC m=+1593.405878164" Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.675871 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"] Mar 19 19:22:48 crc kubenswrapper[4826]: I0319 19:22:48.707084 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-gqz8n"] Mar 19 19:22:49 crc kubenswrapper[4826]: I0319 19:22:49.605570 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-84bf69fdcb-b6hq4" event={"ID":"a8d87bc1-29fa-4219-8c55-968d58f697e8","Type":"ContainerStarted","Data":"368bcbe49b7fbdaa92b69deb130249a2aa0d4c3ec310121a67c38deb3d02c379"} Mar 19 19:22:49 crc kubenswrapper[4826]: I0319 19:22:49.605921 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-84bf69fdcb-b6hq4" Mar 19 19:22:49 crc kubenswrapper[4826]: I0319 19:22:49.998784 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" path="/var/lib/kubelet/pods/af1734c8-f29e-4752-8f25-9f6a79025585/volumes" Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 
19:22:50.619138 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-fb594b459-7sf97" event={"ID":"77ff195c-0819-4764-b09f-fd10a1aea177","Type":"ContainerStarted","Data":"ec16be438f7ef0810f1ee4903435d90da9f777f43aa203aa8bd95a2a579f4c2d"} Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 19:22:50.619526 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 19:22:50.620681 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" event={"ID":"45e3dc79-4f5e-4bec-a579-41b93f1d6150","Type":"ContainerStarted","Data":"7ae96657cc53f057d3f0c0526870d9e34d27641ab689838412b40a11d3119f42"} Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 19:22:50.620906 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 19:22:50.637340 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-fb594b459-7sf97" podStartSLOduration=2.178931753 podStartE2EDuration="3.637321691s" podCreationTimestamp="2026-03-19 19:22:47 +0000 UTC" firstStartedPulling="2026-03-19 19:22:48.332874173 +0000 UTC m=+1593.086942486" lastFinishedPulling="2026-03-19 19:22:49.791264111 +0000 UTC m=+1594.545332424" observedRunningTime="2026-03-19 19:22:50.633755054 +0000 UTC m=+1595.387823367" watchObservedRunningTime="2026-03-19 19:22:50.637321691 +0000 UTC m=+1595.391390004" Mar 19 19:22:50 crc kubenswrapper[4826]: I0319 19:22:50.664796 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" podStartSLOduration=1.912982054 podStartE2EDuration="3.664779463s" podCreationTimestamp="2026-03-19 19:22:47 +0000 UTC" firstStartedPulling="2026-03-19 19:22:48.03939175 +0000 UTC m=+1592.793460063" lastFinishedPulling="2026-03-19 19:22:49.791189159 +0000 UTC 
m=+1594.545257472" observedRunningTime="2026-03-19 19:22:50.661447083 +0000 UTC m=+1595.415515396" watchObservedRunningTime="2026-03-19 19:22:50.664779463 +0000 UTC m=+1595.418847776" Mar 19 19:22:57 crc kubenswrapper[4826]: I0319 19:22:57.976038 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:22:57 crc kubenswrapper[4826]: E0319 19:22:57.976833 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:22:58 crc kubenswrapper[4826]: I0319 19:22:58.864975 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-fb594b459-7sf97" Mar 19 19:22:58 crc kubenswrapper[4826]: I0319 19:22:58.960693 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:22:58 crc kubenswrapper[4826]: I0319 19:22:58.960914 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-784d5749b4-gl4q7" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerName="heat-api" containerID="cri-o://5446ca03f39f3c674da1a362b6032b333090d971f781df7dcac73f902e93848a" gracePeriod=60 Mar 19 19:22:59 crc kubenswrapper[4826]: I0319 19:22:59.513150 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" Mar 19 19:22:59 crc kubenswrapper[4826]: I0319 19:22:59.633305 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:22:59 crc kubenswrapper[4826]: I0319 19:22:59.633514 4826 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/heat-cfnapi-559768d959-n8n6w" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerName="heat-cfnapi" containerID="cri-o://d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f" gracePeriod=60 Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.339342 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-784d5749b4-gl4q7" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.235:8004/healthcheck\": read tcp 10.217.0.2:60784->10.217.0.235:8004: read: connection reset by peer" Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.810461 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-559768d959-n8n6w" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.236:8000/healthcheck\": read tcp 10.217.0.2:53914->10.217.0.236:8000: read: connection reset by peer" Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.828436 4826 generic.go:334] "Generic (PLEG): container finished" podID="de5e1809-5292-4c32-a83e-9dbb01f1db4b" containerID="85cd82324840327d65a54cf0e628c0891c26bad8f87b2bdaf19c767754370f8e" exitCode=0 Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.828526 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"de5e1809-5292-4c32-a83e-9dbb01f1db4b","Type":"ContainerDied","Data":"85cd82324840327d65a54cf0e628c0891c26bad8f87b2bdaf19c767754370f8e"} Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.832765 4826 generic.go:334] "Generic (PLEG): container finished" podID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerID="5446ca03f39f3c674da1a362b6032b333090d971f781df7dcac73f902e93848a" exitCode=0 Mar 19 19:23:02 crc kubenswrapper[4826]: I0319 19:23:02.832808 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-784d5749b4-gl4q7" event={"ID":"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac","Type":"ContainerDied","Data":"5446ca03f39f3c674da1a362b6032b333090d971f781df7dcac73f902e93848a"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.129614 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249587 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249742 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249796 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249970 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z27bh\" (UniqueName: 
\"kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.249999 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.353045 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.353443 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") pod \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\" (UID: \"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac\") " Mar 19 19:23:03 crc kubenswrapper[4826]: W0319 19:23:03.353706 4826 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac/volumes/kubernetes.io~secret/config-data-custom Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.353721 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.355355 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.358197 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh" (OuterVolumeSpecName: "kube-api-access-z27bh") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "kube-api-access-z27bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.452102 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.467806 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z27bh\" (UniqueName: \"kubernetes.io/projected/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-kube-api-access-z27bh\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.467840 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.472688 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.506749 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data" (OuterVolumeSpecName: "config-data") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.510450 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" (UID: "07d156e5-8ee8-46e0-880f-4cf3fd7d2aac"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.570290 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.570327 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.570337 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.665301 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.775572 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.775686 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lljph\" (UniqueName: \"kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.775774 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.775844 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.775964 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.776056 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data\") pod \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\" (UID: \"63a6bcd4-833d-4f25-a4ab-e890afd8feb1\") " Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.787447 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.790854 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph" (OuterVolumeSpecName: "kube-api-access-lljph") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "kube-api-access-lljph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.826530 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.846127 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data" (OuterVolumeSpecName: "config-data") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.849776 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-784d5749b4-gl4q7" event={"ID":"07d156e5-8ee8-46e0-880f-4cf3fd7d2aac","Type":"ContainerDied","Data":"65068b9bcf79aa38cf3d0a9cfb5d3aa5593a175a7b589cd68d69ef048390dcec"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.849823 4826 scope.go:117] "RemoveContainer" containerID="5446ca03f39f3c674da1a362b6032b333090d971f781df7dcac73f902e93848a" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.849990 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-784d5749b4-gl4q7" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.860411 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"de5e1809-5292-4c32-a83e-9dbb01f1db4b","Type":"ContainerStarted","Data":"61ee102d39bb6d8538216fde3b50927fa97aeef67c3c580239ecf93e494062fd"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.860966 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.861819 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.875044 4826 generic.go:334] "Generic (PLEG): container finished" podID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerID="d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f" exitCode=0 Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.875127 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-559768d959-n8n6w" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.875137 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-559768d959-n8n6w" event={"ID":"63a6bcd4-833d-4f25-a4ab-e890afd8feb1","Type":"ContainerDied","Data":"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.875163 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-559768d959-n8n6w" event={"ID":"63a6bcd4-833d-4f25-a4ab-e890afd8feb1","Type":"ContainerDied","Data":"4a1d859f8a14665eb264785e6ff8bfa0d3c8bf459cbb4f1e28fb5a0737d6ee25"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.879908 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lljph\" (UniqueName: \"kubernetes.io/projected/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-kube-api-access-lljph\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.879958 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.879970 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.879979 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.879988 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.888771 4826 generic.go:334] "Generic (PLEG): container finished" podID="208228fc-8848-4817-96ea-48e37f6386ce" containerID="b2162d04f73e1d7502688b6dbf701b8b79bcaa056cf6a8c51449be4bb7bd7717" exitCode=0 Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.888823 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"208228fc-8848-4817-96ea-48e37f6386ce","Type":"ContainerDied","Data":"b2162d04f73e1d7502688b6dbf701b8b79bcaa056cf6a8c51449be4bb7bd7717"} Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.894775 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=38.894757132 podStartE2EDuration="38.894757132s" podCreationTimestamp="2026-03-19 19:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:23:03.881068231 +0000 UTC m=+1608.635136554" watchObservedRunningTime="2026-03-19 19:23:03.894757132 +0000 UTC m=+1608.648825445" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.910428 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63a6bcd4-833d-4f25-a4ab-e890afd8feb1" (UID: "63a6bcd4-833d-4f25-a4ab-e890afd8feb1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:03 crc kubenswrapper[4826]: I0319 19:23:03.983428 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63a6bcd4-833d-4f25-a4ab-e890afd8feb1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.045316 4826 scope.go:117] "RemoveContainer" containerID="d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f" Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.053227 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.068248 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-784d5749b4-gl4q7"] Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.078357 4826 scope.go:117] "RemoveContainer" containerID="d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f" Mar 19 19:23:04 crc kubenswrapper[4826]: E0319 19:23:04.084376 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f\": container with ID starting with d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f not found: ID does not exist" containerID="d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f" Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.084416 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f"} err="failed to get container status \"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f\": rpc error: code = NotFound desc = could not find container \"d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f\": container with ID starting with 
d85d3e712985dd5f00d7a95045fa98d205a5c9d5719eccb9bdba353b07351c6f not found: ID does not exist" Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.206353 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.216839 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-559768d959-n8n6w"] Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.901799 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"208228fc-8848-4817-96ea-48e37f6386ce","Type":"ContainerStarted","Data":"e9c4a1fdb2a344ab1bb67b32941ef797f2639cf780427a71b44e710b812847b3"} Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.902276 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:23:04 crc kubenswrapper[4826]: I0319 19:23:04.932736 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.932716569 podStartE2EDuration="39.932716569s" podCreationTimestamp="2026-03-19 19:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:23:04.921171609 +0000 UTC m=+1609.675239942" watchObservedRunningTime="2026-03-19 19:23:04.932716569 +0000 UTC m=+1609.686784882" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.006605 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" path="/var/lib/kubelet/pods/07d156e5-8ee8-46e0-880f-4cf3fd7d2aac/volumes" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.008682 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" path="/var/lib/kubelet/pods/63a6bcd4-833d-4f25-a4ab-e890afd8feb1/volumes" Mar 19 19:23:06 crc 
kubenswrapper[4826]: I0319 19:23:06.370752 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl"] Mar 19 19:23:06 crc kubenswrapper[4826]: E0319 19:23:06.371929 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerName="heat-cfnapi" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.371951 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerName="heat-cfnapi" Mar 19 19:23:06 crc kubenswrapper[4826]: E0319 19:23:06.371977 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerName="heat-api" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.371984 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerName="heat-api" Mar 19 19:23:06 crc kubenswrapper[4826]: E0319 19:23:06.371994 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="dnsmasq-dns" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.372001 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="dnsmasq-dns" Mar 19 19:23:06 crc kubenswrapper[4826]: E0319 19:23:06.372028 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="init" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.372035 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="init" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.372615 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1734c8-f29e-4752-8f25-9f6a79025585" containerName="dnsmasq-dns" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.372666 4826 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="63a6bcd4-833d-4f25-a4ab-e890afd8feb1" containerName="heat-cfnapi" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.372682 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d156e5-8ee8-46e0-880f-4cf3fd7d2aac" containerName="heat-api" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.373997 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.380004 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.381245 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.381374 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.382880 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.411058 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl"] Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.438913 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.439141 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.439285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvrt\" (UniqueName: \"kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.439362 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.541818 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.541978 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.542054 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvrt\" (UniqueName: \"kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.542094 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.553449 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.553523 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.558449 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.558699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvrt\" (UniqueName: \"kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:06 crc kubenswrapper[4826]: I0319 19:23:06.709940 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:07 crc kubenswrapper[4826]: I0319 19:23:07.503381 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl"] Mar 19 19:23:07 crc kubenswrapper[4826]: W0319 19:23:07.526393 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd499bbf3_6fa0_4467_92f6_7ccaa0f71b06.slice/crio-e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122 WatchSource:0}: Error finding container e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122: Status 404 returned error can't find the container with id e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122 Mar 19 19:23:07 crc kubenswrapper[4826]: I0319 19:23:07.541351 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-84bf69fdcb-b6hq4" Mar 19 19:23:07 crc kubenswrapper[4826]: I0319 19:23:07.630611 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:23:07 crc kubenswrapper[4826]: I0319 19:23:07.630867 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5778865fb9-z27ps" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" containerID="cri-o://68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" gracePeriod=60 Mar 19 19:23:07 crc kubenswrapper[4826]: I0319 19:23:07.948110 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" event={"ID":"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06","Type":"ContainerStarted","Data":"e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122"} Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.544281 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-db-sync-76txf"] Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.561900 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-76txf"] Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.687014 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-h4j6w"] Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.688580 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.693302 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.705362 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-h4j6w"] Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.733793 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlgld\" (UniqueName: \"kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.733851 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.733891 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" 
Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.734099 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.838097 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlgld\" (UniqueName: \"kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.838175 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.838219 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.838308 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.847906 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.855812 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.857747 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:09 crc kubenswrapper[4826]: I0319 19:23:09.883249 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlgld\" (UniqueName: \"kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld\") pod \"aodh-db-sync-h4j6w\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:10 crc kubenswrapper[4826]: I0319 19:23:10.018788 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:10 crc kubenswrapper[4826]: I0319 19:23:10.022087 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f41dad7-21ad-43b0-9e07-443bcd0c8c6a" path="/var/lib/kubelet/pods/1f41dad7-21ad-43b0-9e07-443bcd0c8c6a/volumes" Mar 19 19:23:11 crc kubenswrapper[4826]: I0319 19:23:11.699698 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-h4j6w"] Mar 19 19:23:11 crc kubenswrapper[4826]: I0319 19:23:11.976271 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:23:11 crc kubenswrapper[4826]: E0319 19:23:11.976585 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:23:12 crc kubenswrapper[4826]: I0319 19:23:12.028056 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h4j6w" event={"ID":"6a20d409-205e-4c2a-9197-a6fad3fc7e94","Type":"ContainerStarted","Data":"2ac7b7e19a4602883a096b7a7e236b8f621ed0e34811bb5b795be9a9d04141f2"} Mar 19 19:23:13 crc kubenswrapper[4826]: E0319 19:23:13.079408 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:13 crc kubenswrapper[4826]: E0319 19:23:13.081343 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:13 crc kubenswrapper[4826]: E0319 19:23:13.082677 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:13 crc kubenswrapper[4826]: E0319 19:23:13.082746 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5778865fb9-z27ps" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" Mar 19 19:23:16 crc kubenswrapper[4826]: I0319 19:23:16.141803 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 19:23:16 crc kubenswrapper[4826]: I0319 19:23:16.455254 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 19 19:23:16 crc kubenswrapper[4826]: I0319 19:23:16.533602 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:21 crc kubenswrapper[4826]: I0319 19:23:21.479048 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="rabbitmq" containerID="cri-o://44ca87a6f1315aca3fa78783b92f5e8f7f8b8fcb3cb71b35e857cd4f7578c88c" gracePeriod=604796 Mar 19 19:23:23 crc kubenswrapper[4826]: E0319 19:23:23.079241 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee is running failed: container process not found" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:23 crc kubenswrapper[4826]: E0319 19:23:23.081317 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee is running failed: container process not found" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:23 crc kubenswrapper[4826]: E0319 19:23:23.081873 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee is running failed: container process not found" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 19:23:23 crc kubenswrapper[4826]: E0319 19:23:23.081944 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-5778865fb9-z27ps" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" Mar 19 19:23:23 crc kubenswrapper[4826]: I0319 19:23:23.185158 4826 generic.go:334] "Generic (PLEG): container finished" podID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" exitCode=0 Mar 19 19:23:23 crc kubenswrapper[4826]: I0319 19:23:23.185197 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5778865fb9-z27ps" event={"ID":"76db7194-58de-4efa-8ffc-a18f17d2a3c4","Type":"ContainerDied","Data":"68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee"} Mar 19 19:23:23 crc kubenswrapper[4826]: I0319 19:23:23.888464 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:23:23 crc kubenswrapper[4826]: I0319 19:23:23.900117 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 19:23:23 crc kubenswrapper[4826]: I0319 19:23:23.977706 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:23:23 crc kubenswrapper[4826]: E0319 19:23:23.978083 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.116442 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.198730 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5778865fb9-z27ps" event={"ID":"76db7194-58de-4efa-8ffc-a18f17d2a3c4","Type":"ContainerDied","Data":"69740d7ca7b86c059bf74495223cd2f5a4d9d1fd8f209241cce4976d07d9c841"} Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.199164 4826 scope.go:117] "RemoveContainer" containerID="68dfaf9f9c83d4523aa658f5fab31f7d1183a95ef00b2a8f508bf2a06aba4fee" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.198754 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5778865fb9-z27ps" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.220614 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data\") pod \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.220695 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom\") pod \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.220720 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle\") pod \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.220756 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx9fn\" (UniqueName: 
\"kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn\") pod \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\" (UID: \"76db7194-58de-4efa-8ffc-a18f17d2a3c4\") " Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.227620 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "76db7194-58de-4efa-8ffc-a18f17d2a3c4" (UID: "76db7194-58de-4efa-8ffc-a18f17d2a3c4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.230030 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn" (OuterVolumeSpecName: "kube-api-access-mx9fn") pod "76db7194-58de-4efa-8ffc-a18f17d2a3c4" (UID: "76db7194-58de-4efa-8ffc-a18f17d2a3c4"). InnerVolumeSpecName "kube-api-access-mx9fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.289372 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76db7194-58de-4efa-8ffc-a18f17d2a3c4" (UID: "76db7194-58de-4efa-8ffc-a18f17d2a3c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.323718 4826 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.323748 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.323758 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx9fn\" (UniqueName: \"kubernetes.io/projected/76db7194-58de-4efa-8ffc-a18f17d2a3c4-kube-api-access-mx9fn\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.334965 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data" (OuterVolumeSpecName: "config-data") pod "76db7194-58de-4efa-8ffc-a18f17d2a3c4" (UID: "76db7194-58de-4efa-8ffc-a18f17d2a3c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.426370 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76db7194-58de-4efa-8ffc-a18f17d2a3c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.536801 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:23:24 crc kubenswrapper[4826]: I0319 19:23:24.549992 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5778865fb9-z27ps"] Mar 19 19:23:25 crc kubenswrapper[4826]: I0319 19:23:25.239840 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" event={"ID":"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06","Type":"ContainerStarted","Data":"118b48320b60775b2aade493f6ba8f4e8e5f07c2de860dbcca8bdb51db54e388"} Mar 19 19:23:25 crc kubenswrapper[4826]: I0319 19:23:25.246011 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h4j6w" event={"ID":"6a20d409-205e-4c2a-9197-a6fad3fc7e94","Type":"ContainerStarted","Data":"82a7514d32eb46e0c15228fa576238ef2b003bde6ee7e20d68573001764192ed"} Mar 19 19:23:25 crc kubenswrapper[4826]: I0319 19:23:25.280988 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" podStartSLOduration=2.925249771 podStartE2EDuration="19.280966133s" podCreationTimestamp="2026-03-19 19:23:06 +0000 UTC" firstStartedPulling="2026-03-19 19:23:07.529741106 +0000 UTC m=+1612.283809449" lastFinishedPulling="2026-03-19 19:23:23.885457498 +0000 UTC m=+1628.639525811" observedRunningTime="2026-03-19 19:23:25.265145864 +0000 UTC m=+1630.019214207" watchObservedRunningTime="2026-03-19 19:23:25.280966133 +0000 UTC m=+1630.035034436" Mar 19 19:23:25 crc kubenswrapper[4826]: I0319 19:23:25.296936 
4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-h4j6w" podStartSLOduration=4.1010803 podStartE2EDuration="16.296904804s" podCreationTimestamp="2026-03-19 19:23:09 +0000 UTC" firstStartedPulling="2026-03-19 19:23:11.70087829 +0000 UTC m=+1616.454946603" lastFinishedPulling="2026-03-19 19:23:23.896702794 +0000 UTC m=+1628.650771107" observedRunningTime="2026-03-19 19:23:25.288120079 +0000 UTC m=+1630.042188402" watchObservedRunningTime="2026-03-19 19:23:25.296904804 +0000 UTC m=+1630.050973127" Mar 19 19:23:25 crc kubenswrapper[4826]: I0319 19:23:25.992458 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" path="/var/lib/kubelet/pods/76db7194-58de-4efa-8ffc-a18f17d2a3c4/volumes" Mar 19 19:23:27 crc kubenswrapper[4826]: I0319 19:23:27.278768 4826 generic.go:334] "Generic (PLEG): container finished" podID="6a20d409-205e-4c2a-9197-a6fad3fc7e94" containerID="82a7514d32eb46e0c15228fa576238ef2b003bde6ee7e20d68573001764192ed" exitCode=0 Mar 19 19:23:27 crc kubenswrapper[4826]: I0319 19:23:27.279034 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h4j6w" event={"ID":"6a20d409-205e-4c2a-9197-a6fad3fc7e94","Type":"ContainerDied","Data":"82a7514d32eb46e0c15228fa576238ef2b003bde6ee7e20d68573001764192ed"} Mar 19 19:23:27 crc kubenswrapper[4826]: I0319 19:23:27.910962 4826 scope.go:117] "RemoveContainer" containerID="04d31648074977e485b156fca3ec61b392463a5f1231866828c07c4f46cf8e09" Mar 19 19:23:27 crc kubenswrapper[4826]: I0319 19:23:27.948338 4826 scope.go:117] "RemoveContainer" containerID="3ab563d3809408b45476ed084a9feb0b7ea57fe73e485377a7e6e1960114f76f" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.053515 4826 scope.go:117] "RemoveContainer" containerID="d401a99a442648f9b3fadc5fa90a70eef741fe9341c23b08bf40a32aca5d3fe1" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.171999 4826 scope.go:117] "RemoveContainer" 
containerID="a3f356020523a8534a55d56aa1622f99d7f72f9c6e7be99763f558b6abe84902" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.294505 4826 generic.go:334] "Generic (PLEG): container finished" podID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerID="44ca87a6f1315aca3fa78783b92f5e8f7f8b8fcb3cb71b35e857cd4f7578c88c" exitCode=0 Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.294559 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerDied","Data":"44ca87a6f1315aca3fa78783b92f5e8f7f8b8fcb3cb71b35e857cd4f7578c88c"} Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.342525 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.436923 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437275 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437309 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437349 4826 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437401 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437426 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.437450 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.438163 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.438206 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: 
\"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.438278 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4vr4\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.438423 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie\") pod \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\" (UID: \"2325ef7c-90a0-48f3-81f0-ede3e7f33570\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.445400 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.448144 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.448801 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.449866 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.455965 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4" (OuterVolumeSpecName: "kube-api-access-m4vr4") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "kube-api-access-m4vr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.459598 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info" (OuterVolumeSpecName: "pod-info") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.459774 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.483054 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e" (OuterVolumeSpecName: "persistence") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.516087 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data" (OuterVolumeSpecName: "config-data") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.533215 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf" (OuterVolumeSpecName: "server-conf") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.542934 4826 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543008 4826 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2325ef7c-90a0-48f3-81f0-ede3e7f33570-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543517 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543562 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") on node \"crc\" " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543577 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4vr4\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-kube-api-access-m4vr4\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543588 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543596 4826 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2325ef7c-90a0-48f3-81f0-ede3e7f33570-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 
19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543605 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543614 4826 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2325ef7c-90a0-48f3-81f0-ede3e7f33570-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.543622 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.596279 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.596479 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e") on node "crc" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.613912 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2325ef7c-90a0-48f3-81f0-ede3e7f33570" (UID: "2325ef7c-90a0-48f3-81f0-ede3e7f33570"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.645293 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.645334 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2325ef7c-90a0-48f3-81f0-ede3e7f33570-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.690133 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.849304 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data\") pod \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.849514 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts\") pod \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.849542 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlgld\" (UniqueName: \"kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld\") pod \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.849703 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle\") pod \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\" (UID: \"6a20d409-205e-4c2a-9197-a6fad3fc7e94\") " Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.855308 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts" (OuterVolumeSpecName: "scripts") pod "6a20d409-205e-4c2a-9197-a6fad3fc7e94" (UID: "6a20d409-205e-4c2a-9197-a6fad3fc7e94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.859147 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld" (OuterVolumeSpecName: "kube-api-access-vlgld") pod "6a20d409-205e-4c2a-9197-a6fad3fc7e94" (UID: "6a20d409-205e-4c2a-9197-a6fad3fc7e94"). InnerVolumeSpecName "kube-api-access-vlgld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.957280 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.957318 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlgld\" (UniqueName: \"kubernetes.io/projected/6a20d409-205e-4c2a-9197-a6fad3fc7e94-kube-api-access-vlgld\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.970789 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data" (OuterVolumeSpecName: "config-data") pod "6a20d409-205e-4c2a-9197-a6fad3fc7e94" (UID: "6a20d409-205e-4c2a-9197-a6fad3fc7e94"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:28 crc kubenswrapper[4826]: I0319 19:23:28.995007 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a20d409-205e-4c2a-9197-a6fad3fc7e94" (UID: "6a20d409-205e-4c2a-9197-a6fad3fc7e94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.060020 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.060289 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a20d409-205e-4c2a-9197-a6fad3fc7e94-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.311552 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-h4j6w" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.311552 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-h4j6w" event={"ID":"6a20d409-205e-4c2a-9197-a6fad3fc7e94","Type":"ContainerDied","Data":"2ac7b7e19a4602883a096b7a7e236b8f621ed0e34811bb5b795be9a9d04141f2"} Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.311690 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac7b7e19a4602883a096b7a7e236b8f621ed0e34811bb5b795be9a9d04141f2" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.314139 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2325ef7c-90a0-48f3-81f0-ede3e7f33570","Type":"ContainerDied","Data":"1069dee418723266d4b0d18b50cf3b7afbf08c0019ab1b520270ebdbbac3ca91"} Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.314275 4826 scope.go:117] "RemoveContainer" containerID="44ca87a6f1315aca3fa78783b92f5e8f7f8b8fcb3cb71b35e857cd4f7578c88c" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.314216 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.355988 4826 scope.go:117] "RemoveContainer" containerID="045565dbdc1fcc69a6554c263960690e114832329c56dc8084a5c107a59ae84b" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.368469 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.381183 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.392912 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:29 crc kubenswrapper[4826]: E0319 19:23:29.393560 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a20d409-205e-4c2a-9197-a6fad3fc7e94" containerName="aodh-db-sync" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393588 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a20d409-205e-4c2a-9197-a6fad3fc7e94" containerName="aodh-db-sync" Mar 19 19:23:29 crc kubenswrapper[4826]: E0319 19:23:29.393605 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="setup-container" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393612 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="setup-container" Mar 19 19:23:29 crc kubenswrapper[4826]: E0319 19:23:29.393645 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393667 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" Mar 19 19:23:29 crc kubenswrapper[4826]: E0319 19:23:29.393683 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="rabbitmq" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393688 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="rabbitmq" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393916 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a20d409-205e-4c2a-9197-a6fad3fc7e94" containerName="aodh-db-sync" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393947 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" containerName="rabbitmq" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.393962 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="76db7194-58de-4efa-8ffc-a18f17d2a3c4" containerName="heat-engine" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.395249 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.404586 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.572686 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.572742 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 
19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573050 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573237 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573318 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573554 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-config-data\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573620 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573816 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbflt\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-kube-api-access-kbflt\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573861 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573905 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.573995 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677019 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-config-data\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677092 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677170 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbflt\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-kube-api-access-kbflt\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677204 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677246 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677291 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677393 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-tls\") pod 
\"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677435 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677549 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677600 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.677638 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.678966 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc 
kubenswrapper[4826]: I0319 19:23:29.679028 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.679283 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.679366 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-server-conf\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.679878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-config-data\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.684684 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-pod-info\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.684702 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.685556 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.686164 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.689685 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.689746 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/444a67217cc3019efbdec4573c58565e98449ba1a91ed8a6d8ae2930727b7eaf/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.706820 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbflt\" (UniqueName: \"kubernetes.io/projected/97a64d0f-56cc-4ec0-9e02-49fbbe998f43-kube-api-access-kbflt\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.714680 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.715108 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-api" containerID="cri-o://d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1" gracePeriod=30 Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.715184 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-listener" containerID="cri-o://f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4" gracePeriod=30 Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.715286 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" 
containerName="aodh-notifier" containerID="cri-o://4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47" gracePeriod=30 Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.716904 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-evaluator" containerID="cri-o://1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610" gracePeriod=30 Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.789019 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1ab1b068-4700-4ede-9fbf-f1e7d28eb79e\") pod \"rabbitmq-server-1\" (UID: \"97a64d0f-56cc-4ec0-9e02-49fbbe998f43\") " pod="openstack/rabbitmq-server-1" Mar 19 19:23:29 crc kubenswrapper[4826]: I0319 19:23:29.987644 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325ef7c-90a0-48f3-81f0-ede3e7f33570" path="/var/lib/kubelet/pods/2325ef7c-90a0-48f3-81f0-ede3e7f33570/volumes" Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.036317 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.330469 4826 generic.go:334] "Generic (PLEG): container finished" podID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerID="1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610" exitCode=0 Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.330738 4826 generic.go:334] "Generic (PLEG): container finished" podID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerID="d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1" exitCode=0 Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.330550 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerDied","Data":"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610"} Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.330786 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerDied","Data":"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1"} Mar 19 19:23:30 crc kubenswrapper[4826]: I0319 19:23:30.542044 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 19:23:31 crc kubenswrapper[4826]: I0319 19:23:31.348095 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97a64d0f-56cc-4ec0-9e02-49fbbe998f43","Type":"ContainerStarted","Data":"7206237d5bb3d8155bc42ec929b14d214d37a9fadad13bcc5fca1592973ee724"} Mar 19 19:23:33 crc kubenswrapper[4826]: I0319 19:23:33.369328 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97a64d0f-56cc-4ec0-9e02-49fbbe998f43","Type":"ContainerStarted","Data":"d855a197d007a8f6edff88b142f78cce6588caa908943792c3bdad833cc04000"} Mar 19 19:23:33 crc kubenswrapper[4826]: I0319 19:23:33.373644 4826 generic.go:334] 
"Generic (PLEG): container finished" podID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerID="f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4" exitCode=0 Mar 19 19:23:33 crc kubenswrapper[4826]: I0319 19:23:33.373705 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerDied","Data":"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4"} Mar 19 19:23:34 crc kubenswrapper[4826]: I0319 19:23:34.976468 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:23:34 crc kubenswrapper[4826]: E0319 19:23:34.977384 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:23:36 crc kubenswrapper[4826]: I0319 19:23:36.421482 4826 generic.go:334] "Generic (PLEG): container finished" podID="d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" containerID="118b48320b60775b2aade493f6ba8f4e8e5f07c2de860dbcca8bdb51db54e388" exitCode=0 Mar 19 19:23:36 crc kubenswrapper[4826]: I0319 19:23:36.421582 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" event={"ID":"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06","Type":"ContainerDied","Data":"118b48320b60775b2aade493f6ba8f4e8e5f07c2de860dbcca8bdb51db54e388"} Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.062465 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.112522 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle\") pod \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.112965 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam\") pod \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.113059 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory\") pod \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.113165 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvrt\" (UniqueName: \"kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt\") pod \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\" (UID: \"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06\") " Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.118820 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt" (OuterVolumeSpecName: "kube-api-access-wfvrt") pod "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" (UID: "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06"). InnerVolumeSpecName "kube-api-access-wfvrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.124066 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" (UID: "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.150803 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory" (OuterVolumeSpecName: "inventory") pod "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" (UID: "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.164294 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" (UID: "d499bbf3-6fa0-4467-92f6-7ccaa0f71b06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.215932 4826 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.215962 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.215978 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.215989 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvrt\" (UniqueName: \"kubernetes.io/projected/d499bbf3-6fa0-4467-92f6-7ccaa0f71b06-kube-api-access-wfvrt\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.448237 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" event={"ID":"d499bbf3-6fa0-4467-92f6-7ccaa0f71b06","Type":"ContainerDied","Data":"e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122"} Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.448284 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ef186b8e59e84e4682aed7507edc3a4e07a93141aa7ee9af8ac8c8fcd66122" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.448342 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.586771 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v"] Mar 19 19:23:38 crc kubenswrapper[4826]: E0319 19:23:38.587879 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.587905 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.588594 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d499bbf3-6fa0-4467-92f6-7ccaa0f71b06" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.590431 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.596614 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.596793 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.597007 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.597007 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.614191 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v"] Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.728085 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.728551 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rffxl\" (UniqueName: \"kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.729145 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.831959 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.832205 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.832304 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rffxl\" (UniqueName: \"kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.837478 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.837702 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.848256 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rffxl\" (UniqueName: \"kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qb95v\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:38 crc kubenswrapper[4826]: I0319 19:23:38.930067 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.138005 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240109 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240190 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240224 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240246 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240437 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.240468 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spk8l\" (UniqueName: 
\"kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l\") pod \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\" (UID: \"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91\") " Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.246699 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l" (OuterVolumeSpecName: "kube-api-access-spk8l") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "kube-api-access-spk8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.247323 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts" (OuterVolumeSpecName: "scripts") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.314560 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.340840 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.351697 4826 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.351728 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spk8l\" (UniqueName: \"kubernetes.io/projected/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-kube-api-access-spk8l\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.351739 4826 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.351750 4826 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.420800 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.440190 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data" (OuterVolumeSpecName: "config-data") pod "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" (UID: "ed2acf4d-53a0-4bbb-bc0d-c021cf699d91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.455227 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.455261 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.461835 4826 generic.go:334] "Generic (PLEG): container finished" podID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerID="4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47" exitCode=0 Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.461874 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerDied","Data":"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47"} Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.461900 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ed2acf4d-53a0-4bbb-bc0d-c021cf699d91","Type":"ContainerDied","Data":"e0afcfa5950c62b0559388abb3a46dadde1fd5d30cabcde3a94d6f3a15e07660"} Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.461941 4826 scope.go:117] "RemoveContainer" containerID="f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.462139 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.501524 4826 scope.go:117] "RemoveContainer" containerID="4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.522758 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.533130 4826 scope.go:117] "RemoveContainer" containerID="1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.537729 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.551765 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.552490 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-notifier" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552515 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-notifier" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.552535 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-listener" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552544 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-listener" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.552571 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-evaluator" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552579 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" 
containerName="aodh-evaluator" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.552601 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-api" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552610 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-api" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552889 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-listener" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552925 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-api" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552954 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-evaluator" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.552968 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" containerName="aodh-notifier" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.555592 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.556886 4826 scope.go:117] "RemoveContainer" containerID="d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.561030 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.561179 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pqw5p" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.561291 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.561393 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.563413 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.564544 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.592225 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v"] Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.594663 4826 scope.go:117] "RemoveContainer" containerID="f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.595047 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4\": container with ID starting with f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4 not found: ID does not exist" 
containerID="f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595094 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4"} err="failed to get container status \"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4\": rpc error: code = NotFound desc = could not find container \"f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4\": container with ID starting with f88cb585e4bb2afd745d3e2e177e1850ff875a9d51deba8343e25dfe245845e4 not found: ID does not exist" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595122 4826 scope.go:117] "RemoveContainer" containerID="4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.595524 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47\": container with ID starting with 4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47 not found: ID does not exist" containerID="4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595562 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47"} err="failed to get container status \"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47\": rpc error: code = NotFound desc = could not find container \"4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47\": container with ID starting with 4a573f40f01ececb85f329d1f7d54bb1ee1c8a9122f52617986cfe85a9b58e47 not found: ID does not exist" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595591 4826 scope.go:117] 
"RemoveContainer" containerID="1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.595820 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610\": container with ID starting with 1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610 not found: ID does not exist" containerID="1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595846 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610"} err="failed to get container status \"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610\": rpc error: code = NotFound desc = could not find container \"1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610\": container with ID starting with 1809704acf553601df52423f6734335622681a932932ff9a09d1aecac4f31610 not found: ID does not exist" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.595862 4826 scope.go:117] "RemoveContainer" containerID="d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1" Mar 19 19:23:39 crc kubenswrapper[4826]: E0319 19:23:39.596065 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1\": container with ID starting with d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1 not found: ID does not exist" containerID="d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.596111 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1"} err="failed to get container status \"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1\": rpc error: code = NotFound desc = could not find container \"d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1\": container with ID starting with d5a9eb116013690bd9a216eb73d3c0899145605bb5f7509456aa201f07962be1 not found: ID does not exist" Mar 19 19:23:39 crc kubenswrapper[4826]: W0319 19:23:39.600503 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00cc58b_deec_42ba_aab9_2f6cfd6ff5a4.slice/crio-095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07 WatchSource:0}: Error finding container 095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07: Status 404 returned error can't find the container with id 095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07 Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659092 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-public-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659155 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-config-data\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpwr\" (UniqueName: 
\"kubernetes.io/projected/e9249025-1986-4855-8225-8ca0601709ce-kube-api-access-jqpwr\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659333 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-scripts\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659386 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-internal-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.659422 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.761320 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-scripts\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.761438 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-internal-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 
19:23:39.761494 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.761531 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-public-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.761580 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-config-data\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.761721 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpwr\" (UniqueName: \"kubernetes.io/projected/e9249025-1986-4855-8225-8ca0601709ce-kube-api-access-jqpwr\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.765471 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-scripts\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.765505 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-internal-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " 
pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.765702 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-config-data\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.766015 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-public-tls-certs\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.766519 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9249025-1986-4855-8225-8ca0601709ce-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.782051 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpwr\" (UniqueName: \"kubernetes.io/projected/e9249025-1986-4855-8225-8ca0601709ce-kube-api-access-jqpwr\") pod \"aodh-0\" (UID: \"e9249025-1986-4855-8225-8ca0601709ce\") " pod="openstack/aodh-0" Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.881841 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0"
Mar 19 19:23:39 crc kubenswrapper[4826]: I0319 19:23:39.992105 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2acf4d-53a0-4bbb-bc0d-c021cf699d91" path="/var/lib/kubelet/pods/ed2acf4d-53a0-4bbb-bc0d-c021cf699d91/volumes"
Mar 19 19:23:40 crc kubenswrapper[4826]: W0319 19:23:40.367013 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9249025_1986_4855_8225_8ca0601709ce.slice/crio-15e552c94e28566e339ec9219b01b87620a1a70a40578d7d32988f88a1f80ca4 WatchSource:0}: Error finding container 15e552c94e28566e339ec9219b01b87620a1a70a40578d7d32988f88a1f80ca4: Status 404 returned error can't find the container with id 15e552c94e28566e339ec9219b01b87620a1a70a40578d7d32988f88a1f80ca4
Mar 19 19:23:40 crc kubenswrapper[4826]: I0319 19:23:40.392402 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 19 19:23:40 crc kubenswrapper[4826]: I0319 19:23:40.491609 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e9249025-1986-4855-8225-8ca0601709ce","Type":"ContainerStarted","Data":"15e552c94e28566e339ec9219b01b87620a1a70a40578d7d32988f88a1f80ca4"}
Mar 19 19:23:40 crc kubenswrapper[4826]: I0319 19:23:40.493440 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" event={"ID":"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4","Type":"ContainerStarted","Data":"0546d9ace2417b6788c39ac8e360a8e6086540c1370b4a5e9b83e7c5dcc41c78"}
Mar 19 19:23:40 crc kubenswrapper[4826]: I0319 19:23:40.493489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" event={"ID":"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4","Type":"ContainerStarted","Data":"095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07"}
Mar 19 19:23:40 crc kubenswrapper[4826]: I0319 19:23:40.522807 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" podStartSLOduration=2.014985331 podStartE2EDuration="2.522787023s" podCreationTimestamp="2026-03-19 19:23:38 +0000 UTC" firstStartedPulling="2026-03-19 19:23:39.603767942 +0000 UTC m=+1644.357836265" lastFinishedPulling="2026-03-19 19:23:40.111569644 +0000 UTC m=+1644.865637957" observedRunningTime="2026-03-19 19:23:40.513184828 +0000 UTC m=+1645.267253161" watchObservedRunningTime="2026-03-19 19:23:40.522787023 +0000 UTC m=+1645.276855336"
Mar 19 19:23:41 crc kubenswrapper[4826]: I0319 19:23:41.526388 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e9249025-1986-4855-8225-8ca0601709ce","Type":"ContainerStarted","Data":"5eab0c63907a3995340252820ec7e90c11b05537324ce5f29f5f941f076b51f8"}
Mar 19 19:23:42 crc kubenswrapper[4826]: I0319 19:23:42.539910 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e9249025-1986-4855-8225-8ca0601709ce","Type":"ContainerStarted","Data":"5a41e379fd77186742ac926942dd7a4420174f33c370fcaef52648574126d974"}
Mar 19 19:23:43 crc kubenswrapper[4826]: I0319 19:23:43.558281 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e9249025-1986-4855-8225-8ca0601709ce","Type":"ContainerStarted","Data":"961d9d7785ffc3190133e8096a2aae129485ae8720b99b346e29243477bb7582"}
Mar 19 19:23:43 crc kubenswrapper[4826]: I0319 19:23:43.560459 4826 generic.go:334] "Generic (PLEG): container finished" podID="d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" containerID="0546d9ace2417b6788c39ac8e360a8e6086540c1370b4a5e9b83e7c5dcc41c78" exitCode=0
Mar 19 19:23:43 crc kubenswrapper[4826]: I0319 19:23:43.560512 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" event={"ID":"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4","Type":"ContainerDied","Data":"0546d9ace2417b6788c39ac8e360a8e6086540c1370b4a5e9b83e7c5dcc41c78"}
Mar 19 19:23:44 crc kubenswrapper[4826]: I0319 19:23:44.574589 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e9249025-1986-4855-8225-8ca0601709ce","Type":"ContainerStarted","Data":"0bb24a6907df3e957cbf2380399af219a759a53d763daf1c2ffdbf442b9be95c"}
Mar 19 19:23:44 crc kubenswrapper[4826]: I0319 19:23:44.610037 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.9023050160000001 podStartE2EDuration="5.61001946s" podCreationTimestamp="2026-03-19 19:23:39 +0000 UTC" firstStartedPulling="2026-03-19 19:23:40.38744426 +0000 UTC m=+1645.141512573" lastFinishedPulling="2026-03-19 19:23:44.095158704 +0000 UTC m=+1648.849227017" observedRunningTime="2026-03-19 19:23:44.602107205 +0000 UTC m=+1649.356175518" watchObservedRunningTime="2026-03-19 19:23:44.61001946 +0000 UTC m=+1649.364087773"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.263209 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.424238 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam\") pod \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") "
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.424623 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rffxl\" (UniqueName: \"kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl\") pod \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") "
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.424761 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory\") pod \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\" (UID: \"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4\") "
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.448462 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl" (OuterVolumeSpecName: "kube-api-access-rffxl") pod "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" (UID: "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4"). InnerVolumeSpecName "kube-api-access-rffxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.473426 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" (UID: "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.485387 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory" (OuterVolumeSpecName: "inventory") pod "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" (UID: "d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.529616 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rffxl\" (UniqueName: \"kubernetes.io/projected/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-kube-api-access-rffxl\") on node \"crc\" DevicePath \"\""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.529690 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.529707 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.589054 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.590896 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qb95v" event={"ID":"d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4","Type":"ContainerDied","Data":"095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07"}
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.590962 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095bd6377ee546b739ead81d01d49692411fab714be3c91a0b09eab63558ad07"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.664111 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"]
Mar 19 19:23:45 crc kubenswrapper[4826]: E0319 19:23:45.664696 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.664721 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.664967 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.665844 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.670849 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.670933 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.671020 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.671114 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.682042 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"]
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.837909 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.837990 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.838167 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.838235 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crl4n\" (UniqueName: \"kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.940617 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.940786 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crl4n\" (UniqueName: \"kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.941074 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.941158 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.944264 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.944711 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.945806 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.969479 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crl4n\" (UniqueName: \"kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:45 crc kubenswrapper[4826]: I0319 19:23:45.989791 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"
Mar 19 19:23:46 crc kubenswrapper[4826]: W0319 19:23:46.632992 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c0489c_9ec7_4851_b96a_d2cebe602bf2.slice/crio-9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c WatchSource:0}: Error finding container 9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c: Status 404 returned error can't find the container with id 9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c
Mar 19 19:23:46 crc kubenswrapper[4826]: I0319 19:23:46.637430 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq"]
Mar 19 19:23:46 crc kubenswrapper[4826]: I0319 19:23:46.977142 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"
Mar 19 19:23:46 crc kubenswrapper[4826]: E0319 19:23:46.977607 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:23:47 crc kubenswrapper[4826]: I0319 19:23:47.613956 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" event={"ID":"a5c0489c-9ec7-4851-b96a-d2cebe602bf2","Type":"ContainerStarted","Data":"082a366f0381fbf2c5caf4c2142668b55cefaccf11a958b587be50142df6ad27"}
Mar 19 19:23:47 crc kubenswrapper[4826]: I0319 19:23:47.614216 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" event={"ID":"a5c0489c-9ec7-4851-b96a-d2cebe602bf2","Type":"ContainerStarted","Data":"9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c"}
Mar 19 19:23:47 crc kubenswrapper[4826]: I0319 19:23:47.648852 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" podStartSLOduration=2.1835346270000002 podStartE2EDuration="2.648828545s" podCreationTimestamp="2026-03-19 19:23:45 +0000 UTC" firstStartedPulling="2026-03-19 19:23:46.636278876 +0000 UTC m=+1651.390347179" lastFinishedPulling="2026-03-19 19:23:47.101572784 +0000 UTC m=+1651.855641097" observedRunningTime="2026-03-19 19:23:47.634032751 +0000 UTC m=+1652.388101084" watchObservedRunningTime="2026-03-19 19:23:47.648828545 +0000 UTC m=+1652.402896868"
Mar 19 19:23:58 crc kubenswrapper[4826]: I0319 19:23:58.978351 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"
Mar 19 19:23:58 crc kubenswrapper[4826]: E0319 19:23:58.981078 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.158538 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565804-fjhhm"]
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.160874 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.164325 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.164514 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.165030 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.176584 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-fjhhm"]
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.341795 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7pf\" (UniqueName: \"kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf\") pod \"auto-csr-approver-29565804-fjhhm\" (UID: \"0eadd329-3574-4dcc-a93d-163d850caa5a\") " pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.444248 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7pf\" (UniqueName: \"kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf\") pod \"auto-csr-approver-29565804-fjhhm\" (UID: \"0eadd329-3574-4dcc-a93d-163d850caa5a\") " pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.467878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7pf\" (UniqueName: \"kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf\") pod \"auto-csr-approver-29565804-fjhhm\" (UID: \"0eadd329-3574-4dcc-a93d-163d850caa5a\") " pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:00 crc kubenswrapper[4826]: I0319 19:24:00.483753 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:01 crc kubenswrapper[4826]: I0319 19:24:01.012380 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-fjhhm"]
Mar 19 19:24:01 crc kubenswrapper[4826]: W0319 19:24:01.026611 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eadd329_3574_4dcc_a93d_163d850caa5a.slice/crio-13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481 WatchSource:0}: Error finding container 13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481: Status 404 returned error can't find the container with id 13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481
Mar 19 19:24:01 crc kubenswrapper[4826]: I0319 19:24:01.031487 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 19:24:01 crc kubenswrapper[4826]: I0319 19:24:01.789941 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-fjhhm" event={"ID":"0eadd329-3574-4dcc-a93d-163d850caa5a","Type":"ContainerStarted","Data":"13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481"}
Mar 19 19:24:02 crc kubenswrapper[4826]: I0319 19:24:02.820982 4826 generic.go:334] "Generic (PLEG): container finished" podID="0eadd329-3574-4dcc-a93d-163d850caa5a" containerID="1822cdbf54e42ae9c9de3ddaf66eef30a5a35bec9d9f90947b3e4984139e9265" exitCode=0
Mar 19 19:24:02 crc kubenswrapper[4826]: I0319 19:24:02.821489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-fjhhm" event={"ID":"0eadd329-3574-4dcc-a93d-163d850caa5a","Type":"ContainerDied","Data":"1822cdbf54e42ae9c9de3ddaf66eef30a5a35bec9d9f90947b3e4984139e9265"}
Mar 19 19:24:04 crc kubenswrapper[4826]: I0319 19:24:04.400905 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:04 crc kubenswrapper[4826]: I0319 19:24:04.491435 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh7pf\" (UniqueName: \"kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf\") pod \"0eadd329-3574-4dcc-a93d-163d850caa5a\" (UID: \"0eadd329-3574-4dcc-a93d-163d850caa5a\") "
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.021527 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf" (OuterVolumeSpecName: "kube-api-access-sh7pf") pod "0eadd329-3574-4dcc-a93d-163d850caa5a" (UID: "0eadd329-3574-4dcc-a93d-163d850caa5a"). InnerVolumeSpecName "kube-api-access-sh7pf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.027203 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh7pf\" (UniqueName: \"kubernetes.io/projected/0eadd329-3574-4dcc-a93d-163d850caa5a-kube-api-access-sh7pf\") on node \"crc\" DevicePath \"\""
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.039597 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565804-fjhhm" event={"ID":"0eadd329-3574-4dcc-a93d-163d850caa5a","Type":"ContainerDied","Data":"13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481"}
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.039624 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565804-fjhhm"
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.039686 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13e72aa4108a853ccc4b99fb2a5489dc336e3860103764b2cb7d27e0ddb6d481"
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.495485 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-jlzqc"]
Mar 19 19:24:05 crc kubenswrapper[4826]: I0319 19:24:05.511890 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565798-jlzqc"]
Mar 19 19:24:06 crc kubenswrapper[4826]: I0319 19:24:06.005800 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef07f29-afd3-40df-a0c7-098109beedde" path="/var/lib/kubelet/pods/eef07f29-afd3-40df-a0c7-098109beedde/volumes"
Mar 19 19:24:06 crc kubenswrapper[4826]: I0319 19:24:06.061194 4826 generic.go:334] "Generic (PLEG): container finished" podID="97a64d0f-56cc-4ec0-9e02-49fbbe998f43" containerID="d855a197d007a8f6edff88b142f78cce6588caa908943792c3bdad833cc04000" exitCode=0
Mar 19 19:24:06 crc kubenswrapper[4826]: I0319 19:24:06.061396 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97a64d0f-56cc-4ec0-9e02-49fbbe998f43","Type":"ContainerDied","Data":"d855a197d007a8f6edff88b142f78cce6588caa908943792c3bdad833cc04000"}
Mar 19 19:24:07 crc kubenswrapper[4826]: I0319 19:24:07.080325 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"97a64d0f-56cc-4ec0-9e02-49fbbe998f43","Type":"ContainerStarted","Data":"418beec4f75ffde97de4fff47c14234e034f158aa132da00fa92c043460f363e"}
Mar 19 19:24:07 crc kubenswrapper[4826]: I0319 19:24:07.081016 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Mar 19 19:24:07 crc kubenswrapper[4826]: I0319 19:24:07.118408 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.118382023 podStartE2EDuration="38.118382023s" podCreationTimestamp="2026-03-19 19:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:24:07.113913893 +0000 UTC m=+1671.867982276" watchObservedRunningTime="2026-03-19 19:24:07.118382023 +0000 UTC m=+1671.872450376"
Mar 19 19:24:13 crc kubenswrapper[4826]: I0319 19:24:13.981342 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"
Mar 19 19:24:13 crc kubenswrapper[4826]: E0319 19:24:13.983755 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:24:20 crc kubenswrapper[4826]: I0319 19:24:20.042318 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 19 19:24:20 crc kubenswrapper[4826]: I0319 19:24:20.119691 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 19:24:24 crc kubenswrapper[4826]: I0319 19:24:24.591683 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" containerID="cri-o://d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3" gracePeriod=604796
Mar 19 19:24:26 crc kubenswrapper[4826]: I0319 19:24:26.983988 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c"
Mar 19 19:24:26 crc kubenswrapper[4826]: E0319 19:24:26.984817 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:24:28 crc kubenswrapper[4826]: I0319 19:24:28.567109 4826 scope.go:117] "RemoveContainer" containerID="030d67dc9fded5f77bdaa9901408cf79bf4434eeb018ce2652412744efa216e6"
Mar 19 19:24:28 crc kubenswrapper[4826]: I0319 19:24:28.655803 4826 scope.go:117] "RemoveContainer" containerID="872c881ab948c81d0951bdb7bf9dadc68d5f06840e9b32507e13f4e851e6e533"
Mar 19 19:24:30 crc kubenswrapper[4826]: I0319 19:24:30.469390 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused"
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.328251 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440577 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440725 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440775 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440802 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440835 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440856 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.440904 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.441003 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.441914 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.441976 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.442047 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhpq\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq\") pod \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\" (UID: \"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5\") "
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.443108 4826 generic.go:334] "Generic (PLEG): container finished" podID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerID="d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3" exitCode=0
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.443172 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerDied","Data":"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3"}
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.443204 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e617bcf9-daaa-4a7a-949c-cdf0fc9646a5","Type":"ContainerDied","Data":"3bb8d5045124888b616e485e6dfd698e2ad1188dbddec76c6fd9a6b02e84f3ee"}
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.443220 4826 scope.go:117] "RemoveContainer" containerID="d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3"
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.443385 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.444138 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.446122 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.450066 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.453373 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.459398 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.463940 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq" (OuterVolumeSpecName: "kube-api-access-pvhpq") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "kube-api-access-pvhpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.466715 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info" (OuterVolumeSpecName: "pod-info") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.523110 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data" (OuterVolumeSpecName: "config-data") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545547 4826 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545576 4826 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545584 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545593 4826 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545601 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545608 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.545618 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 
19:24:31.545629 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhpq\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-kube-api-access-pvhpq\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.567116 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0" (OuterVolumeSpecName: "persistence") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.575575 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf" (OuterVolumeSpecName: "server-conf") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.608312 4826 scope.go:117] "RemoveContainer" containerID="9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.611840 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" (UID: "e617bcf9-daaa-4a7a-949c-cdf0fc9646a5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.648242 4826 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.648548 4826 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.648585 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") on node \"crc\" " Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.656975 4826 scope.go:117] "RemoveContainer" containerID="d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3" Mar 19 19:24:31 crc kubenswrapper[4826]: E0319 19:24:31.657763 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3\": container with ID starting with d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3 not found: ID does not exist" containerID="d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.657805 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3"} err="failed to get container status \"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3\": rpc error: code = NotFound desc = could not find container 
\"d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3\": container with ID starting with d5305552064e09b571620e69bf94a0e89302212f965b53b1173ceb0e701975c3 not found: ID does not exist" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.657832 4826 scope.go:117] "RemoveContainer" containerID="9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd" Mar 19 19:24:31 crc kubenswrapper[4826]: E0319 19:24:31.658805 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd\": container with ID starting with 9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd not found: ID does not exist" containerID="9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.658870 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd"} err="failed to get container status \"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd\": rpc error: code = NotFound desc = could not find container \"9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd\": container with ID starting with 9f47c3da92f7e4ec1425e458bec96773c98df6c88e2e80157791087f2af7f4bd not found: ID does not exist" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.689634 4826 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.689815 4826 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0") on node "crc" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.750722 4826 reconciler_common.go:293] "Volume detached for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") on node \"crc\" DevicePath \"\"" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.780429 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.794882 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.814012 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:24:31 crc kubenswrapper[4826]: E0319 19:24:31.814468 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eadd329-3574-4dcc-a93d-163d850caa5a" containerName="oc" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.815108 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eadd329-3574-4dcc-a93d-163d850caa5a" containerName="oc" Mar 19 19:24:31 crc kubenswrapper[4826]: E0319 19:24:31.815166 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="setup-container" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.815173 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="setup-container" Mar 19 19:24:31 crc kubenswrapper[4826]: E0319 19:24:31.815183 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.815191 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.815431 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eadd329-3574-4dcc-a93d-163d850caa5a" containerName="oc" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.815472 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" containerName="rabbitmq" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.816693 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.835673 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954358 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954393 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954428 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954452 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp4tb\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-kube-api-access-kp4tb\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954476 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954493 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954516 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954643 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c7e2447b-d047-4e45-a992-ffa82c1c5215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954677 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954712 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7e2447b-d047-4e45-a992-ffa82c1c5215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.954771 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:31 crc kubenswrapper[4826]: I0319 19:24:31.988596 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e617bcf9-daaa-4a7a-949c-cdf0fc9646a5" path="/var/lib/kubelet/pods/e617bcf9-daaa-4a7a-949c-cdf0fc9646a5/volumes" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.056895 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7e2447b-d047-4e45-a992-ffa82c1c5215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: 
I0319 19:24:32.056938 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057017 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7e2447b-d047-4e45-a992-ffa82c1c5215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057124 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057254 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057270 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057309 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057345 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4tb\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-kube-api-access-kp4tb\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057365 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057388 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.057414 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.059107 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.059377 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.059637 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-config-data\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.060195 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.060260 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c7e2447b-d047-4e45-a992-ffa82c1c5215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.064520 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.064823 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.064904 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c7e2447b-d047-4e45-a992-ffa82c1c5215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.066665 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c7e2447b-d047-4e45-a992-ffa82c1c5215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.066816 4826 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.066850 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/474e894b78a3d9a61a889b2bf5e73544c970fa007f104547e7045d9e5c9c2882/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.091183 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp4tb\" (UniqueName: \"kubernetes.io/projected/c7e2447b-d047-4e45-a992-ffa82c1c5215-kube-api-access-kp4tb\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.130292 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a4b76c04-891e-4a0b-9bcd-c8581b59c5c0\") pod \"rabbitmq-server-0\" (UID: \"c7e2447b-d047-4e45-a992-ffa82c1c5215\") " pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.134868 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 19:24:32 crc kubenswrapper[4826]: I0319 19:24:32.714511 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 19:24:33 crc kubenswrapper[4826]: I0319 19:24:33.473083 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7e2447b-d047-4e45-a992-ffa82c1c5215","Type":"ContainerStarted","Data":"d86025f13099796faf8931d1055cfa7dc40ea35191a262d4f259bccf31bb7e07"} Mar 19 19:24:35 crc kubenswrapper[4826]: I0319 19:24:35.507205 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7e2447b-d047-4e45-a992-ffa82c1c5215","Type":"ContainerStarted","Data":"6c53e39ed8c7b29878bf34d51c6227235bf17f953466027d22c9e0a5bb748e1d"} Mar 19 19:24:37 crc kubenswrapper[4826]: I0319 19:24:37.976536 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:24:37 crc kubenswrapper[4826]: E0319 19:24:37.977067 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:24:52 crc kubenswrapper[4826]: I0319 19:24:52.976066 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:24:52 crc kubenswrapper[4826]: E0319 19:24:52.977061 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:25:05 crc kubenswrapper[4826]: I0319 19:25:05.995717 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:25:05 crc kubenswrapper[4826]: E0319 19:25:05.997472 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:25:07 crc kubenswrapper[4826]: I0319 19:25:07.983435 4826 generic.go:334] "Generic (PLEG): container finished" podID="c7e2447b-d047-4e45-a992-ffa82c1c5215" containerID="6c53e39ed8c7b29878bf34d51c6227235bf17f953466027d22c9e0a5bb748e1d" exitCode=0 Mar 19 19:25:08 crc kubenswrapper[4826]: I0319 19:25:08.003995 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7e2447b-d047-4e45-a992-ffa82c1c5215","Type":"ContainerDied","Data":"6c53e39ed8c7b29878bf34d51c6227235bf17f953466027d22c9e0a5bb748e1d"} Mar 19 19:25:09 crc kubenswrapper[4826]: I0319 19:25:09.008692 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c7e2447b-d047-4e45-a992-ffa82c1c5215","Type":"ContainerStarted","Data":"b73c162b34ec7ce4ca92f94694e39abc1f1e6564bcf4c10a78dd8a49758ef6e4"} Mar 19 19:25:09 crc kubenswrapper[4826]: I0319 19:25:09.011227 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 19:25:09 crc kubenswrapper[4826]: I0319 19:25:09.056849 4826 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.056802091 podStartE2EDuration="38.056802091s" podCreationTimestamp="2026-03-19 19:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:25:09.039648559 +0000 UTC m=+1733.793716932" watchObservedRunningTime="2026-03-19 19:25:09.056802091 +0000 UTC m=+1733.810870424" Mar 19 19:25:17 crc kubenswrapper[4826]: I0319 19:25:17.976112 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:25:17 crc kubenswrapper[4826]: E0319 19:25:17.977025 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:25:22 crc kubenswrapper[4826]: I0319 19:25:22.141830 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 19:25:28 crc kubenswrapper[4826]: I0319 19:25:28.840433 4826 scope.go:117] "RemoveContainer" containerID="d4baa6814b918b1dfbc41b1d24266caac18661ff58725f3f3044a81bdf0da630" Mar 19 19:25:29 crc kubenswrapper[4826]: I0319 19:25:29.976488 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:25:29 crc kubenswrapper[4826]: E0319 19:25:29.977107 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:25:42 crc kubenswrapper[4826]: I0319 19:25:42.977451 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:25:42 crc kubenswrapper[4826]: E0319 19:25:42.978746 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:25:53 crc kubenswrapper[4826]: I0319 19:25:53.978120 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:25:53 crc kubenswrapper[4826]: E0319 19:25:53.979639 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.167589 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565806-qjm7v"] Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.170154 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.172235 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.173000 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.174705 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.205060 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-qjm7v"] Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.258864 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgv8\" (UniqueName: \"kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8\") pod \"auto-csr-approver-29565806-qjm7v\" (UID: \"5ab15c7e-f20f-44a8-aa1f-4156be2adec9\") " pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.361454 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgv8\" (UniqueName: \"kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8\") pod \"auto-csr-approver-29565806-qjm7v\" (UID: \"5ab15c7e-f20f-44a8-aa1f-4156be2adec9\") " pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.385048 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgv8\" (UniqueName: \"kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8\") pod \"auto-csr-approver-29565806-qjm7v\" (UID: \"5ab15c7e-f20f-44a8-aa1f-4156be2adec9\") " 
pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:00 crc kubenswrapper[4826]: I0319 19:26:00.501648 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:01 crc kubenswrapper[4826]: I0319 19:26:01.106651 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-qjm7v"] Mar 19 19:26:01 crc kubenswrapper[4826]: I0319 19:26:01.744632 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" event={"ID":"5ab15c7e-f20f-44a8-aa1f-4156be2adec9","Type":"ContainerStarted","Data":"6296cbbfce6f353bd17b713c7e7cc6b41af79070df973fa1c6c40b517f17e057"} Mar 19 19:26:02 crc kubenswrapper[4826]: I0319 19:26:02.766024 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" event={"ID":"5ab15c7e-f20f-44a8-aa1f-4156be2adec9","Type":"ContainerStarted","Data":"b3f2a66dec58233f73318862b34c57797c4ba7d6fc294398f3c9e5eabaa806c1"} Mar 19 19:26:02 crc kubenswrapper[4826]: I0319 19:26:02.798030 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" podStartSLOduration=1.6859482639999999 podStartE2EDuration="2.798008588s" podCreationTimestamp="2026-03-19 19:26:00 +0000 UTC" firstStartedPulling="2026-03-19 19:26:01.106719348 +0000 UTC m=+1785.860787671" lastFinishedPulling="2026-03-19 19:26:02.218779682 +0000 UTC m=+1786.972847995" observedRunningTime="2026-03-19 19:26:02.777679839 +0000 UTC m=+1787.531748152" watchObservedRunningTime="2026-03-19 19:26:02.798008588 +0000 UTC m=+1787.552076901" Mar 19 19:26:03 crc kubenswrapper[4826]: I0319 19:26:03.782292 4826 generic.go:334] "Generic (PLEG): container finished" podID="5ab15c7e-f20f-44a8-aa1f-4156be2adec9" containerID="b3f2a66dec58233f73318862b34c57797c4ba7d6fc294398f3c9e5eabaa806c1" exitCode=0 Mar 19 19:26:03 crc 
kubenswrapper[4826]: I0319 19:26:03.782408 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" event={"ID":"5ab15c7e-f20f-44a8-aa1f-4156be2adec9","Type":"ContainerDied","Data":"b3f2a66dec58233f73318862b34c57797c4ba7d6fc294398f3c9e5eabaa806c1"} Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.220106 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.303708 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fgv8\" (UniqueName: \"kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8\") pod \"5ab15c7e-f20f-44a8-aa1f-4156be2adec9\" (UID: \"5ab15c7e-f20f-44a8-aa1f-4156be2adec9\") " Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.310909 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8" (OuterVolumeSpecName: "kube-api-access-7fgv8") pod "5ab15c7e-f20f-44a8-aa1f-4156be2adec9" (UID: "5ab15c7e-f20f-44a8-aa1f-4156be2adec9"). InnerVolumeSpecName "kube-api-access-7fgv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.407248 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fgv8\" (UniqueName: \"kubernetes.io/projected/5ab15c7e-f20f-44a8-aa1f-4156be2adec9-kube-api-access-7fgv8\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.814617 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" event={"ID":"5ab15c7e-f20f-44a8-aa1f-4156be2adec9","Type":"ContainerDied","Data":"6296cbbfce6f353bd17b713c7e7cc6b41af79070df973fa1c6c40b517f17e057"} Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.814896 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6296cbbfce6f353bd17b713c7e7cc6b41af79070df973fa1c6c40b517f17e057" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.814757 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565806-qjm7v" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.873619 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-l7j4f"] Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.887209 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565800-l7j4f"] Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.989324 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:26:05 crc kubenswrapper[4826]: E0319 19:26:05.989709 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:26:05 crc kubenswrapper[4826]: I0319 19:26:05.992367 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb05aceb-7f5c-42c3-a9a6-1242def4b9cc" path="/var/lib/kubelet/pods/eb05aceb-7f5c-42c3-a9a6-1242def4b9cc/volumes" Mar 19 19:26:16 crc kubenswrapper[4826]: I0319 19:26:16.977254 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:26:16 crc kubenswrapper[4826]: E0319 19:26:16.978065 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:26:28 crc kubenswrapper[4826]: I0319 19:26:28.988629 4826 scope.go:117] "RemoveContainer" containerID="dad70c4189d94827d3fb69e4bb747fcd71cb6a4ae9acbcdbf1334d3d619c5756" Mar 19 19:26:29 crc kubenswrapper[4826]: I0319 19:26:29.977685 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:26:29 crc kubenswrapper[4826]: E0319 19:26:29.978261 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:26:39 crc kubenswrapper[4826]: I0319 19:26:39.285771 4826 generic.go:334] "Generic (PLEG): container 
finished" podID="a5c0489c-9ec7-4851-b96a-d2cebe602bf2" containerID="082a366f0381fbf2c5caf4c2142668b55cefaccf11a958b587be50142df6ad27" exitCode=0 Mar 19 19:26:39 crc kubenswrapper[4826]: I0319 19:26:39.286265 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" event={"ID":"a5c0489c-9ec7-4851-b96a-d2cebe602bf2","Type":"ContainerDied","Data":"082a366f0381fbf2c5caf4c2142668b55cefaccf11a958b587be50142df6ad27"} Mar 19 19:26:40 crc kubenswrapper[4826]: I0319 19:26:40.891811 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.013544 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crl4n\" (UniqueName: \"kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n\") pod \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.013624 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle\") pod \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.013857 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory\") pod \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.013918 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam\") pod \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\" (UID: \"a5c0489c-9ec7-4851-b96a-d2cebe602bf2\") " Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.019485 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a5c0489c-9ec7-4851-b96a-d2cebe602bf2" (UID: "a5c0489c-9ec7-4851-b96a-d2cebe602bf2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.029969 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n" (OuterVolumeSpecName: "kube-api-access-crl4n") pod "a5c0489c-9ec7-4851-b96a-d2cebe602bf2" (UID: "a5c0489c-9ec7-4851-b96a-d2cebe602bf2"). InnerVolumeSpecName "kube-api-access-crl4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.053140 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a5c0489c-9ec7-4851-b96a-d2cebe602bf2" (UID: "a5c0489c-9ec7-4851-b96a-d2cebe602bf2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.053555 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory" (OuterVolumeSpecName: "inventory") pod "a5c0489c-9ec7-4851-b96a-d2cebe602bf2" (UID: "a5c0489c-9ec7-4851-b96a-d2cebe602bf2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.117426 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crl4n\" (UniqueName: \"kubernetes.io/projected/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-kube-api-access-crl4n\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.117461 4826 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.117475 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.117488 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5c0489c-9ec7-4851-b96a-d2cebe602bf2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.313719 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" event={"ID":"a5c0489c-9ec7-4851-b96a-d2cebe602bf2","Type":"ContainerDied","Data":"9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c"} Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.313947 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ab6392976685a713233f75b3b7a4a12d0ef4c6de7bf5b52b7cc881e31274e6c" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.313843 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.428434 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5"] Mar 19 19:26:41 crc kubenswrapper[4826]: E0319 19:26:41.429433 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c0489c-9ec7-4851-b96a-d2cebe602bf2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.429462 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c0489c-9ec7-4851-b96a-d2cebe602bf2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:41 crc kubenswrapper[4826]: E0319 19:26:41.429490 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab15c7e-f20f-44a8-aa1f-4156be2adec9" containerName="oc" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.429498 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab15c7e-f20f-44a8-aa1f-4156be2adec9" containerName="oc" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.429798 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab15c7e-f20f-44a8-aa1f-4156be2adec9" containerName="oc" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.429830 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c0489c-9ec7-4851-b96a-d2cebe602bf2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.430864 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.433819 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.434080 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.434511 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.437439 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.444716 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5"] Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.527489 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.527767 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 
19:26:41.527824 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2mv\" (UniqueName: \"kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.630148 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.630325 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.630377 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2mv\" (UniqueName: \"kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.635250 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.637926 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.652315 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2mv\" (UniqueName: \"kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.763634 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:26:41 crc kubenswrapper[4826]: I0319 19:26:41.977568 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:26:41 crc kubenswrapper[4826]: E0319 19:26:41.978148 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:26:42 crc kubenswrapper[4826]: I0319 19:26:42.430336 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5"] Mar 19 19:26:42 crc kubenswrapper[4826]: W0319 19:26:42.435977 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15b97af_3c47_4d33_82d0_20e627959de3.slice/crio-d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0 WatchSource:0}: Error finding container d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0: Status 404 returned error can't find the container with id d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0 Mar 19 19:26:43 crc kubenswrapper[4826]: I0319 19:26:43.349768 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" event={"ID":"e15b97af-3c47-4d33-82d0-20e627959de3","Type":"ContainerStarted","Data":"58eccce4e1b8fc75b2d803dc008f1b837d04fc3be3417022af66337ec98551be"} Mar 19 19:26:43 crc kubenswrapper[4826]: I0319 19:26:43.350446 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" event={"ID":"e15b97af-3c47-4d33-82d0-20e627959de3","Type":"ContainerStarted","Data":"d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0"} Mar 19 19:26:43 crc kubenswrapper[4826]: I0319 19:26:43.378253 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" podStartSLOduration=1.953785006 podStartE2EDuration="2.378237141s" podCreationTimestamp="2026-03-19 19:26:41 +0000 UTC" firstStartedPulling="2026-03-19 19:26:42.44050479 +0000 UTC m=+1827.194573113" lastFinishedPulling="2026-03-19 19:26:42.864956935 +0000 UTC m=+1827.619025248" observedRunningTime="2026-03-19 19:26:43.373483885 +0000 UTC m=+1828.127552198" watchObservedRunningTime="2026-03-19 19:26:43.378237141 +0000 UTC m=+1828.132305454" Mar 19 19:26:52 crc kubenswrapper[4826]: I0319 19:26:52.976610 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:26:52 crc kubenswrapper[4826]: E0319 19:26:52.977704 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.110222 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-v8dg5"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.150800 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-v8dg5"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.168127 4826 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-9d24-account-create-update-qr7td"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.180723 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9d24-account-create-update-qr7td"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.192102 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-3908-account-create-update-qx49d"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.203071 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-3908-account-create-update-qx49d"] Mar 19 19:27:03 crc kubenswrapper[4826]: I0319 19:27:03.977302 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:27:04 crc kubenswrapper[4826]: I0319 19:27:04.006344 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330a9484-e555-4aa4-ae9a-7d09bf97571a" path="/var/lib/kubelet/pods/330a9484-e555-4aa4-ae9a-7d09bf97571a/volumes" Mar 19 19:27:04 crc kubenswrapper[4826]: I0319 19:27:04.007908 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8948d5dc-068b-4571-bd13-f0cbf3750004" path="/var/lib/kubelet/pods/8948d5dc-068b-4571-bd13-f0cbf3750004/volumes" Mar 19 19:27:04 crc kubenswrapper[4826]: I0319 19:27:04.008991 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c70b2d-b15f-4fbe-a820-d24479ea0d40" path="/var/lib/kubelet/pods/a6c70b2d-b15f-4fbe-a820-d24479ea0d40/volumes" Mar 19 19:27:04 crc kubenswrapper[4826]: I0319 19:27:04.649308 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9"} Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.077436 4826 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-db-create-2pq4k"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.103626 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4236-account-create-update-cxc67"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.125636 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2pq4k"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.138925 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-tg74r"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.151825 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pvnh2"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.164529 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4236-account-create-update-cxc67"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.182397 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8770-account-create-update-p98hn"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.195775 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pvnh2"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.216697 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-tg74r"] Mar 19 19:27:08 crc kubenswrapper[4826]: I0319 19:27:08.239562 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8770-account-create-update-p98hn"] Mar 19 19:27:09 crc kubenswrapper[4826]: I0319 19:27:09.997598 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a008787-669e-41e3-9178-b37bc657c710" path="/var/lib/kubelet/pods/0a008787-669e-41e3-9178-b37bc657c710/volumes" Mar 19 19:27:10 crc kubenswrapper[4826]: I0319 19:27:10.000225 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9" 
path="/var/lib/kubelet/pods/4e3df38e-6b8b-4d2c-b45a-1ec0c7c03ac9/volumes" Mar 19 19:27:10 crc kubenswrapper[4826]: I0319 19:27:10.001960 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992fbead-aec8-4dbc-875a-d01481bdec46" path="/var/lib/kubelet/pods/992fbead-aec8-4dbc-875a-d01481bdec46/volumes" Mar 19 19:27:10 crc kubenswrapper[4826]: I0319 19:27:10.004568 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a413e441-0fbd-4400-84be-959ce0870e4e" path="/var/lib/kubelet/pods/a413e441-0fbd-4400-84be-959ce0870e4e/volumes" Mar 19 19:27:10 crc kubenswrapper[4826]: I0319 19:27:10.006912 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43282f8-6f7c-4cb5-a9cd-79cc13a5be89" path="/var/lib/kubelet/pods/e43282f8-6f7c-4cb5-a9cd-79cc13a5be89/volumes" Mar 19 19:27:11 crc kubenswrapper[4826]: I0319 19:27:11.052361 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b"] Mar 19 19:27:11 crc kubenswrapper[4826]: I0319 19:27:11.078373 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-9640-account-create-update-29jnn"] Mar 19 19:27:11 crc kubenswrapper[4826]: I0319 19:27:11.096785 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-kzx9b"] Mar 19 19:27:11 crc kubenswrapper[4826]: I0319 19:27:11.113237 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-9640-account-create-update-29jnn"] Mar 19 19:27:12 crc kubenswrapper[4826]: I0319 19:27:12.003464 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d279b05-60b2-4405-8ba1-11e707e145fe" path="/var/lib/kubelet/pods/4d279b05-60b2-4405-8ba1-11e707e145fe/volumes" Mar 19 19:27:12 crc kubenswrapper[4826]: I0319 19:27:12.005766 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c459bda1-b58d-4425-b401-1493252e282d" 
path="/var/lib/kubelet/pods/c459bda1-b58d-4425-b401-1493252e282d/volumes" Mar 19 19:27:15 crc kubenswrapper[4826]: I0319 19:27:15.042782 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mk77d"] Mar 19 19:27:15 crc kubenswrapper[4826]: I0319 19:27:15.067075 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mk77d"] Mar 19 19:27:15 crc kubenswrapper[4826]: I0319 19:27:15.998750 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="990ccce0-d217-45bb-a1de-f1da6d07e6f4" path="/var/lib/kubelet/pods/990ccce0-d217-45bb-a1de-f1da6d07e6f4/volumes" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.074836 4826 scope.go:117] "RemoveContainer" containerID="9aa31870c72ebdb95890aa6803e6340e13d0ef42dc8a9d3b05b88ee93d4924f8" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.116895 4826 scope.go:117] "RemoveContainer" containerID="4ed752e3c7d3c4d8d33dbb01bf0870a3d2b1f4de97cd4b0d27f49a7a72e04507" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.176310 4826 scope.go:117] "RemoveContainer" containerID="08316ceebdbdbe0ba6ffeda0832f85d21999a6b25a89134bbcbddbcd7aee20c4" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.245212 4826 scope.go:117] "RemoveContainer" containerID="3fe5f5bc1379a3e4a6de1823a8ed65844c5aee9768eb7d386cc56f6a4891adc1" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.277944 4826 scope.go:117] "RemoveContainer" containerID="63d65f9f523a0d9cd6afff7dfc7e7d72734abd437ede480556980f4a6b515765" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.329124 4826 scope.go:117] "RemoveContainer" containerID="27c7991ce5d2cc96d8ed0da4727b8a95d278d96e5d67b995acf8aa3e271bdcf2" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.384614 4826 scope.go:117] "RemoveContainer" containerID="926bfc6635ef53ba250973f547f5e9e47d5486dc664c3fc6813fb4c99d91666b" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.416213 4826 scope.go:117] 
"RemoveContainer" containerID="57a770e2babd457e5d0e0509b6e06c3cd5477b4dcf558f2b3169a5db02195e38" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.442516 4826 scope.go:117] "RemoveContainer" containerID="3087dbeb19b6d1d1ff271676afbdb2242509e1341844f8b9797c3b56b1f7236d" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.473066 4826 scope.go:117] "RemoveContainer" containerID="542933139d76ea84d17315b736ffc8e0232eda1953b1a59a7b95f6273643fa4f" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.510094 4826 scope.go:117] "RemoveContainer" containerID="25353c4ed41f810e11e5308d624954397db7245fdba59471262236628c14c0a9" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.530315 4826 scope.go:117] "RemoveContainer" containerID="7ce6eef6df66bd1bb3b844606039927fd901641babd83bd3643f424e56a42ffa" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.556749 4826 scope.go:117] "RemoveContainer" containerID="ceb48a59bd8031ce31f1a7b448d21311bbf6e5aecdc476c5598cb1b62e26981b" Mar 19 19:27:29 crc kubenswrapper[4826]: I0319 19:27:29.586131 4826 scope.go:117] "RemoveContainer" containerID="8b469432a0ecc0bb635b1e4504109b55673734f874c0b2fd95ffdc5772beb9f4" Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.092751 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dhqnj"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.110414 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-stspj"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.126111 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-fmgf5"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.140236 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-ad6f-account-create-update-wnt5t"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.154297 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dhqnj"] Mar 19 19:27:37 crc 
kubenswrapper[4826]: I0319 19:27:37.171232 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-stspj"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.198470 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-fmgf5"] Mar 19 19:27:37 crc kubenswrapper[4826]: I0319 19:27:37.214965 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-ad6f-account-create-update-wnt5t"] Mar 19 19:27:38 crc kubenswrapper[4826]: I0319 19:27:38.005339 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4f258d-0295-473e-89cf-b714157c3c60" path="/var/lib/kubelet/pods/7d4f258d-0295-473e-89cf-b714157c3c60/volumes" Mar 19 19:27:38 crc kubenswrapper[4826]: I0319 19:27:38.007052 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ca634c-b18e-4567-a7ee-00d102d65496" path="/var/lib/kubelet/pods/87ca634c-b18e-4567-a7ee-00d102d65496/volumes" Mar 19 19:27:38 crc kubenswrapper[4826]: I0319 19:27:38.010107 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6915f1-a5f3-4816-8ed6-1f0232327393" path="/var/lib/kubelet/pods/9f6915f1-a5f3-4816-8ed6-1f0232327393/volumes" Mar 19 19:27:38 crc kubenswrapper[4826]: I0319 19:27:38.012021 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca947aa7-664b-4392-9446-bdc5afdb3d6b" path="/var/lib/kubelet/pods/ca947aa7-664b-4392-9446-bdc5afdb3d6b/volumes" Mar 19 19:27:42 crc kubenswrapper[4826]: I0319 19:27:42.052376 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-454rz"] Mar 19 19:27:42 crc kubenswrapper[4826]: I0319 19:27:42.067966 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e3a0-account-create-update-zqd66"] Mar 19 19:27:42 crc kubenswrapper[4826]: I0319 19:27:42.081375 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-454rz"] Mar 19 19:27:42 crc 
kubenswrapper[4826]: I0319 19:27:42.092401 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e3a0-account-create-update-zqd66"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.042502 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rvwbt"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.063883 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f4b2-account-create-update-lxfwz"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.080440 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1a54-account-create-update-f52lv"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.095184 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f4b2-account-create-update-lxfwz"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.107940 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rvwbt"] Mar 19 19:27:43 crc kubenswrapper[4826]: I0319 19:27:43.119202 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1a54-account-create-update-f52lv"] Mar 19 19:27:44 crc kubenswrapper[4826]: I0319 19:27:44.012566 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733b90f8-38f2-47b3-ae70-43edf8383cd8" path="/var/lib/kubelet/pods/733b90f8-38f2-47b3-ae70-43edf8383cd8/volumes" Mar 19 19:27:44 crc kubenswrapper[4826]: I0319 19:27:44.015740 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d888eb3-4bcd-470d-95c6-aa3d281c6332" path="/var/lib/kubelet/pods/9d888eb3-4bcd-470d-95c6-aa3d281c6332/volumes" Mar 19 19:27:44 crc kubenswrapper[4826]: I0319 19:27:44.017534 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09e9d14-f548-48a1-bbdc-1f1588b80e3a" path="/var/lib/kubelet/pods/a09e9d14-f548-48a1-bbdc-1f1588b80e3a/volumes" Mar 19 19:27:44 crc kubenswrapper[4826]: I0319 19:27:44.022421 4826 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b780a2-0bd1-4947-bf59-b7c27a9c031c" path="/var/lib/kubelet/pods/f8b780a2-0bd1-4947-bf59-b7c27a9c031c/volumes" Mar 19 19:27:44 crc kubenswrapper[4826]: I0319 19:27:44.025094 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd45547e-1987-40bd-ba4a-1156803be411" path="/var/lib/kubelet/pods/fd45547e-1987-40bd-ba4a-1156803be411/volumes" Mar 19 19:27:47 crc kubenswrapper[4826]: I0319 19:27:47.046266 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-b84j2"] Mar 19 19:27:47 crc kubenswrapper[4826]: I0319 19:27:47.065739 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-b84j2"] Mar 19 19:27:47 crc kubenswrapper[4826]: I0319 19:27:47.995373 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f6edca-b463-4c0a-b97a-3d82d73a9590" path="/var/lib/kubelet/pods/92f6edca-b463-4c0a-b97a-3d82d73a9590/volumes" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.164998 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565808-dff7k"] Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.168370 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.174967 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.175189 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.175855 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.179831 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-dff7k"] Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.345044 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqtw\" (UniqueName: \"kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw\") pod \"auto-csr-approver-29565808-dff7k\" (UID: \"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a\") " pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.447348 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqtw\" (UniqueName: \"kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw\") pod \"auto-csr-approver-29565808-dff7k\" (UID: \"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a\") " pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.497738 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqtw\" (UniqueName: \"kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw\") pod \"auto-csr-approver-29565808-dff7k\" (UID: \"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a\") " 
pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:00 crc kubenswrapper[4826]: I0319 19:28:00.509866 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:01 crc kubenswrapper[4826]: I0319 19:28:01.066205 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-dff7k"] Mar 19 19:28:01 crc kubenswrapper[4826]: I0319 19:28:01.440783 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-dff7k" event={"ID":"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a","Type":"ContainerStarted","Data":"69932159b2c96d31aa1ab12b5a01351cf5279ddc0f08a501b9f36874e025122d"} Mar 19 19:28:03 crc kubenswrapper[4826]: I0319 19:28:03.467564 4826 generic.go:334] "Generic (PLEG): container finished" podID="3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" containerID="2e1f4d70f3f933df3f2a33288b8b007376aad8dbc3bf9c627028b05b6d1f6ab9" exitCode=0 Mar 19 19:28:03 crc kubenswrapper[4826]: I0319 19:28:03.467642 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-dff7k" event={"ID":"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a","Type":"ContainerDied","Data":"2e1f4d70f3f933df3f2a33288b8b007376aad8dbc3bf9c627028b05b6d1f6ab9"} Mar 19 19:28:04 crc kubenswrapper[4826]: I0319 19:28:04.934198 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:05 crc kubenswrapper[4826]: I0319 19:28:05.002087 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqtw\" (UniqueName: \"kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw\") pod \"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a\" (UID: \"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a\") " Mar 19 19:28:05 crc kubenswrapper[4826]: I0319 19:28:05.500935 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565808-dff7k" event={"ID":"3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a","Type":"ContainerDied","Data":"69932159b2c96d31aa1ab12b5a01351cf5279ddc0f08a501b9f36874e025122d"} Mar 19 19:28:05 crc kubenswrapper[4826]: I0319 19:28:05.501019 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69932159b2c96d31aa1ab12b5a01351cf5279ddc0f08a501b9f36874e025122d" Mar 19 19:28:05 crc kubenswrapper[4826]: I0319 19:28:05.501103 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565808-dff7k" Mar 19 19:28:05 crc kubenswrapper[4826]: I0319 19:28:05.946966 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw" (OuterVolumeSpecName: "kube-api-access-wrqtw") pod "3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" (UID: "3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a"). InnerVolumeSpecName "kube-api-access-wrqtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:06 crc kubenswrapper[4826]: I0319 19:28:06.026027 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqtw\" (UniqueName: \"kubernetes.io/projected/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a-kube-api-access-wrqtw\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:06 crc kubenswrapper[4826]: I0319 19:28:06.086717 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-wsq4t"] Mar 19 19:28:06 crc kubenswrapper[4826]: I0319 19:28:06.099369 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565802-wsq4t"] Mar 19 19:28:08 crc kubenswrapper[4826]: I0319 19:28:08.006757 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d51938-c21d-4578-afe3-ecbbf8d67bd2" path="/var/lib/kubelet/pods/f8d51938-c21d-4578-afe3-ecbbf8d67bd2/volumes" Mar 19 19:28:17 crc kubenswrapper[4826]: I0319 19:28:17.036478 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jgxmp"] Mar 19 19:28:17 crc kubenswrapper[4826]: I0319 19:28:17.048329 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jgxmp"] Mar 19 19:28:18 crc kubenswrapper[4826]: I0319 19:28:18.002542 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef3b70e-ff7d-48a9-8796-8b20af6e6547" path="/var/lib/kubelet/pods/9ef3b70e-ff7d-48a9-8796-8b20af6e6547/volumes" Mar 19 19:28:26 crc kubenswrapper[4826]: I0319 19:28:26.802572 4826 generic.go:334] "Generic (PLEG): container finished" podID="e15b97af-3c47-4d33-82d0-20e627959de3" containerID="58eccce4e1b8fc75b2d803dc008f1b837d04fc3be3417022af66337ec98551be" exitCode=0 Mar 19 19:28:26 crc kubenswrapper[4826]: I0319 19:28:26.802677 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" 
event={"ID":"e15b97af-3c47-4d33-82d0-20e627959de3","Type":"ContainerDied","Data":"58eccce4e1b8fc75b2d803dc008f1b837d04fc3be3417022af66337ec98551be"} Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.325556 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.372004 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam\") pod \"e15b97af-3c47-4d33-82d0-20e627959de3\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.372335 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l2mv\" (UniqueName: \"kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv\") pod \"e15b97af-3c47-4d33-82d0-20e627959de3\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.373020 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory\") pod \"e15b97af-3c47-4d33-82d0-20e627959de3\" (UID: \"e15b97af-3c47-4d33-82d0-20e627959de3\") " Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.384867 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv" (OuterVolumeSpecName: "kube-api-access-7l2mv") pod "e15b97af-3c47-4d33-82d0-20e627959de3" (UID: "e15b97af-3c47-4d33-82d0-20e627959de3"). InnerVolumeSpecName "kube-api-access-7l2mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.406464 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory" (OuterVolumeSpecName: "inventory") pod "e15b97af-3c47-4d33-82d0-20e627959de3" (UID: "e15b97af-3c47-4d33-82d0-20e627959de3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.431979 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e15b97af-3c47-4d33-82d0-20e627959de3" (UID: "e15b97af-3c47-4d33-82d0-20e627959de3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.476238 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.476277 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l2mv\" (UniqueName: \"kubernetes.io/projected/e15b97af-3c47-4d33-82d0-20e627959de3-kube-api-access-7l2mv\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.476291 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e15b97af-3c47-4d33-82d0-20e627959de3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.830850 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" 
event={"ID":"e15b97af-3c47-4d33-82d0-20e627959de3","Type":"ContainerDied","Data":"d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0"} Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.830908 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1dea539aced12dc6809707098364259f2a819c2846df1cd2b54381dbb5ed5e0" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.831045 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.947945 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2"] Mar 19 19:28:28 crc kubenswrapper[4826]: E0319 19:28:28.948691 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" containerName="oc" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.948721 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" containerName="oc" Mar 19 19:28:28 crc kubenswrapper[4826]: E0319 19:28:28.948786 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b97af-3c47-4d33-82d0-20e627959de3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.948799 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b97af-3c47-4d33-82d0-20e627959de3" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.949147 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" containerName="oc" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.949177 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b97af-3c47-4d33-82d0-20e627959de3" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.950468 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.953164 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.953287 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.953557 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.953642 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.967401 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2"] Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.990449 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.990593 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:28 crc kubenswrapper[4826]: I0319 19:28:28.990849 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8qq\" (UniqueName: \"kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.093222 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.093474 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8qq\" (UniqueName: \"kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.094068 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.102575 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.103851 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.118280 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8qq\" (UniqueName: \"kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.274281 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.844525 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2"] Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.871240 4826 scope.go:117] "RemoveContainer" containerID="c0addfd70dcb79bb9243581f38de8d3e2472393d69e690dc2f92816fc65d8ece" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.924027 4826 scope.go:117] "RemoveContainer" containerID="65170b93cfb43575d4256115d3e03de56f7c6e64fd93442416bc2e4a4f359520" Mar 19 19:28:29 crc kubenswrapper[4826]: I0319 19:28:29.979515 4826 scope.go:117] "RemoveContainer" containerID="21ef0dc3ebf3f9ddb2435ffd8e3564cabbac3213decd62c7806f1503260ee5e7" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.010593 4826 scope.go:117] "RemoveContainer" containerID="111696d1713a2f5f93fafa656988f505288d1c57dddfcbf6ba1445041e2d6b37" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.041368 4826 scope.go:117] "RemoveContainer" containerID="434b2e2245dc6797fb623b9cf3f35e328c48da5060bc5847bb26e5706e09ff1b" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.044078 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hhk58"] Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.055699 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hhk58"] Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.067315 4826 scope.go:117] "RemoveContainer" containerID="008f296df9d929c34c10dbca826659686020eccb5e94ffc11af8acc74b3caddd" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.092561 4826 scope.go:117] "RemoveContainer" containerID="11a96161a72d30617a9c3938a06d39b3ea5b8072c9985ba3d6cf909483a17a5a" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.137942 4826 scope.go:117] "RemoveContainer" 
containerID="08a6f4fbd3e914217a0081f36c98447e599c5ecaaabebe10ccfe789eb86e807f" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.156878 4826 scope.go:117] "RemoveContainer" containerID="75212f93ec3129bc16edebc174b0ad39f7df96e9f81928c5a6b537553dd3ecd8" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.182324 4826 scope.go:117] "RemoveContainer" containerID="bfccc4c312593561e65f147879cf5f6faaf8800563fdc4398f2a5e3f52aa9ae6" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.214399 4826 scope.go:117] "RemoveContainer" containerID="760861aec5f18a584b8e36885125a3777bbbca2857f1b2c7547668c3e91fe1d9" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.307121 4826 scope.go:117] "RemoveContainer" containerID="cca5679346998ae47aa98543875bc5d8eae4408ff4b5a140ec9b4cdd78d0d45d" Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.866509 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" event={"ID":"ce8656f9-5811-44d9-bed1-39fb364ddc4f","Type":"ContainerStarted","Data":"a654b351b6862ac125d35bb1c1744810d5b604067241d146401750d42d48d87b"} Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.866900 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" event={"ID":"ce8656f9-5811-44d9-bed1-39fb364ddc4f","Type":"ContainerStarted","Data":"67579bf99e5657f433a2ceacd79d2e251d8c182eecfcfa57355dcceccd9e712d"} Mar 19 19:28:30 crc kubenswrapper[4826]: I0319 19:28:30.903008 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" podStartSLOduration=2.174901811 podStartE2EDuration="2.902977386s" podCreationTimestamp="2026-03-19 19:28:28 +0000 UTC" firstStartedPulling="2026-03-19 19:28:29.843393062 +0000 UTC m=+1934.597461375" lastFinishedPulling="2026-03-19 19:28:30.571468637 +0000 UTC m=+1935.325536950" 
observedRunningTime="2026-03-19 19:28:30.88833482 +0000 UTC m=+1935.642403173" watchObservedRunningTime="2026-03-19 19:28:30.902977386 +0000 UTC m=+1935.657045739" Mar 19 19:28:31 crc kubenswrapper[4826]: I0319 19:28:31.090410 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7bsn"] Mar 19 19:28:31 crc kubenswrapper[4826]: I0319 19:28:31.105459 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7bsn"] Mar 19 19:28:31 crc kubenswrapper[4826]: I0319 19:28:31.998796 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abad829-61e3-47f1-b24d-58ccb40e58f7" path="/var/lib/kubelet/pods/1abad829-61e3-47f1-b24d-58ccb40e58f7/volumes" Mar 19 19:28:31 crc kubenswrapper[4826]: I0319 19:28:31.999946 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789d342d-013e-45e5-a57b-cde9f8bc0d3f" path="/var/lib/kubelet/pods/789d342d-013e-45e5-a57b-cde9f8bc0d3f/volumes" Mar 19 19:28:41 crc kubenswrapper[4826]: I0319 19:28:41.070286 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xkjdp"] Mar 19 19:28:41 crc kubenswrapper[4826]: I0319 19:28:41.081488 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xkjdp"] Mar 19 19:28:42 crc kubenswrapper[4826]: I0319 19:28:42.000722 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c5e21e-66a0-47a9-b03d-55fbfe372d1b" path="/var/lib/kubelet/pods/30c5e21e-66a0-47a9-b03d-55fbfe372d1b/volumes" Mar 19 19:28:42 crc kubenswrapper[4826]: I0319 19:28:42.046556 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-p7jzd"] Mar 19 19:28:42 crc kubenswrapper[4826]: I0319 19:28:42.061949 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-p7jzd"] Mar 19 19:28:43 crc kubenswrapper[4826]: I0319 19:28:43.993432 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5c80aa39-c840-4267-9677-bb82f387073d" path="/var/lib/kubelet/pods/5c80aa39-c840-4267-9677-bb82f387073d/volumes" Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.076121 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ac0a-account-create-update-nrsjr"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.090515 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-s9rqd"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.103162 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ac0a-account-create-update-nrsjr"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.114707 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9a8c-account-create-update-s22ds"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.127938 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-96hcs"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.138217 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9a8c-account-create-update-s22ds"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.148934 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-96hcs"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.158520 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-s9rqd"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.168897 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-n9d4c"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.178472 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-n9d4c"] Mar 19 19:29:24 crc kubenswrapper[4826]: I0319 19:29:24.188970 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c0a7-account-create-update-7mljh"] Mar 19 19:29:24 
crc kubenswrapper[4826]: I0319 19:29:24.199052 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c0a7-account-create-update-7mljh"] Mar 19 19:29:25 crc kubenswrapper[4826]: I0319 19:29:25.400196 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:29:25 crc kubenswrapper[4826]: I0319 19:29:25.400259 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.001259 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38780138-b2f9-49a0-9ed1-90ee6fbb4c11" path="/var/lib/kubelet/pods/38780138-b2f9-49a0-9ed1-90ee6fbb4c11/volumes" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.003498 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d23be9-1486-41d9-9f8f-86ca8965b96c" path="/var/lib/kubelet/pods/56d23be9-1486-41d9-9f8f-86ca8965b96c/volumes" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.004857 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603d736d-cb9b-41e3-b971-82c457932511" path="/var/lib/kubelet/pods/603d736d-cb9b-41e3-b971-82c457932511/volumes" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.006176 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81a7caa7-3837-46ff-9008-bc8784373127" path="/var/lib/kubelet/pods/81a7caa7-3837-46ff-9008-bc8784373127/volumes" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.008280 4826 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9361ba16-5f40-4c7b-a698-dd74fb1d4af7" path="/var/lib/kubelet/pods/9361ba16-5f40-4c7b-a698-dd74fb1d4af7/volumes" Mar 19 19:29:26 crc kubenswrapper[4826]: I0319 19:29:26.009167 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60e04c4-caed-4173-ac2a-4dca3c4ae080" path="/var/lib/kubelet/pods/a60e04c4-caed-4173-ac2a-4dca3c4ae080/volumes" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.691528 4826 scope.go:117] "RemoveContainer" containerID="ef10ff692a80616e83e4a7fc61781a325ff0c035046a7ef4b60a86295cee14d3" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.733557 4826 scope.go:117] "RemoveContainer" containerID="7c58f9f262ad39c6602e773a8352f8bab23f243e0f8ad7878bd22836f1982c79" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.823454 4826 scope.go:117] "RemoveContainer" containerID="1ef54ebc6936b221c2cf1fbad176ea2f4a00c6202a38a5c29df95790ff70da95" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.887625 4826 scope.go:117] "RemoveContainer" containerID="1c91fd70db79cc97b6c155a9aa2c4c06984d69be8279807468057978454a73d9" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.954297 4826 scope.go:117] "RemoveContainer" containerID="0deba0cf5d3615df13eb6eb2348c1c67555e0d143f4cccf28346e05fe136c7bd" Mar 19 19:29:30 crc kubenswrapper[4826]: I0319 19:29:30.999811 4826 scope.go:117] "RemoveContainer" containerID="21817210888fa22816cfc57f69fbf7ffdd658827d7d41be0bd9f61500d3d372d" Mar 19 19:29:31 crc kubenswrapper[4826]: I0319 19:29:31.054363 4826 scope.go:117] "RemoveContainer" containerID="77ec9d965c883abe338c2908131eb66c9fdb6227af45362583d8fc2afc23015c" Mar 19 19:29:31 crc kubenswrapper[4826]: I0319 19:29:31.085076 4826 scope.go:117] "RemoveContainer" containerID="fff89c799bb4a7d8f1c1471e29d1fcea876eef3b3721a68532df2c08ee66dbf4" Mar 19 19:29:31 crc kubenswrapper[4826]: I0319 19:29:31.110440 4826 scope.go:117] "RemoveContainer" 
containerID="5be86ed4ca6a2ad91533eb1be40cbdd99b329ac6d4f507653c629adccbddfd57" Mar 19 19:29:31 crc kubenswrapper[4826]: I0319 19:29:31.134887 4826 scope.go:117] "RemoveContainer" containerID="fdf467b26b8ede7bc3176bd1dc6d2b9e2853361e447eed4af4ca4256a3452752" Mar 19 19:29:38 crc kubenswrapper[4826]: I0319 19:29:38.872900 4826 generic.go:334] "Generic (PLEG): container finished" podID="ce8656f9-5811-44d9-bed1-39fb364ddc4f" containerID="a654b351b6862ac125d35bb1c1744810d5b604067241d146401750d42d48d87b" exitCode=0 Mar 19 19:29:38 crc kubenswrapper[4826]: I0319 19:29:38.873441 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" event={"ID":"ce8656f9-5811-44d9-bed1-39fb364ddc4f","Type":"ContainerDied","Data":"a654b351b6862ac125d35bb1c1744810d5b604067241d146401750d42d48d87b"} Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.504769 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.536635 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam\") pod \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.536792 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory\") pod \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.536828 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg8qq\" (UniqueName: 
\"kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq\") pod \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\" (UID: \"ce8656f9-5811-44d9-bed1-39fb364ddc4f\") " Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.558028 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq" (OuterVolumeSpecName: "kube-api-access-gg8qq") pod "ce8656f9-5811-44d9-bed1-39fb364ddc4f" (UID: "ce8656f9-5811-44d9-bed1-39fb364ddc4f"). InnerVolumeSpecName "kube-api-access-gg8qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.577011 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory" (OuterVolumeSpecName: "inventory") pod "ce8656f9-5811-44d9-bed1-39fb364ddc4f" (UID: "ce8656f9-5811-44d9-bed1-39fb364ddc4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.614142 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ce8656f9-5811-44d9-bed1-39fb364ddc4f" (UID: "ce8656f9-5811-44d9-bed1-39fb364ddc4f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.638670 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg8qq\" (UniqueName: \"kubernetes.io/projected/ce8656f9-5811-44d9-bed1-39fb364ddc4f-kube-api-access-gg8qq\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.638708 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.638720 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce8656f9-5811-44d9-bed1-39fb364ddc4f-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.905496 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" event={"ID":"ce8656f9-5811-44d9-bed1-39fb364ddc4f","Type":"ContainerDied","Data":"67579bf99e5657f433a2ceacd79d2e251d8c182eecfcfa57355dcceccd9e712d"} Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.906066 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67579bf99e5657f433a2ceacd79d2e251d8c182eecfcfa57355dcceccd9e712d" Mar 19 19:29:40 crc kubenswrapper[4826]: I0319 19:29:40.905543 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.038916 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq"] Mar 19 19:29:41 crc kubenswrapper[4826]: E0319 19:29:41.039452 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8656f9-5811-44d9-bed1-39fb364ddc4f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.039476 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8656f9-5811-44d9-bed1-39fb364ddc4f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.039782 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8656f9-5811-44d9-bed1-39fb364ddc4f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.040750 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.043695 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.043998 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.044052 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.044185 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.053179 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq"] Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.151569 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.151781 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9rsr\" (UniqueName: \"kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 
19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.152242 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.255415 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.255713 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.255822 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9rsr\" (UniqueName: \"kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.261185 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.261493 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.275040 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9rsr\" (UniqueName: \"kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.364962 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.973694 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq"] Mar 19 19:29:41 crc kubenswrapper[4826]: I0319 19:29:41.987934 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:29:42 crc kubenswrapper[4826]: I0319 19:29:42.942198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" event={"ID":"d357bfcc-5ce5-456d-b8b7-142ef30a57b0","Type":"ContainerStarted","Data":"59821b2291e2047b1a1d3b396fdb5feef294682b9e1b2f97d1aeeaa9fe1a2794"} Mar 19 19:29:42 crc kubenswrapper[4826]: I0319 19:29:42.942853 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" event={"ID":"d357bfcc-5ce5-456d-b8b7-142ef30a57b0","Type":"ContainerStarted","Data":"f2db40ff45f79bd7429acb8432541c0b0d042b059106cabaaaec1e1c8d67eb6b"} Mar 19 19:29:42 crc kubenswrapper[4826]: I0319 19:29:42.971874 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" podStartSLOduration=1.458534654 podStartE2EDuration="1.971858224s" podCreationTimestamp="2026-03-19 19:29:41 +0000 UTC" firstStartedPulling="2026-03-19 19:29:41.987520203 +0000 UTC m=+2006.741588536" lastFinishedPulling="2026-03-19 19:29:42.500843753 +0000 UTC m=+2007.254912106" observedRunningTime="2026-03-19 19:29:42.964475783 +0000 UTC m=+2007.718544096" watchObservedRunningTime="2026-03-19 19:29:42.971858224 +0000 UTC m=+2007.725926537" Mar 19 19:29:48 crc kubenswrapper[4826]: I0319 19:29:48.067031 4826 generic.go:334] "Generic (PLEG): container finished" podID="d357bfcc-5ce5-456d-b8b7-142ef30a57b0" 
containerID="59821b2291e2047b1a1d3b396fdb5feef294682b9e1b2f97d1aeeaa9fe1a2794" exitCode=0 Mar 19 19:29:48 crc kubenswrapper[4826]: I0319 19:29:48.067102 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" event={"ID":"d357bfcc-5ce5-456d-b8b7-142ef30a57b0","Type":"ContainerDied","Data":"59821b2291e2047b1a1d3b396fdb5feef294682b9e1b2f97d1aeeaa9fe1a2794"} Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.654846 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.782330 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam\") pod \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.782402 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory\") pod \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.782632 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9rsr\" (UniqueName: \"kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr\") pod \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\" (UID: \"d357bfcc-5ce5-456d-b8b7-142ef30a57b0\") " Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.788758 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr" (OuterVolumeSpecName: 
"kube-api-access-f9rsr") pod "d357bfcc-5ce5-456d-b8b7-142ef30a57b0" (UID: "d357bfcc-5ce5-456d-b8b7-142ef30a57b0"). InnerVolumeSpecName "kube-api-access-f9rsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.817186 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d357bfcc-5ce5-456d-b8b7-142ef30a57b0" (UID: "d357bfcc-5ce5-456d-b8b7-142ef30a57b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.819905 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory" (OuterVolumeSpecName: "inventory") pod "d357bfcc-5ce5-456d-b8b7-142ef30a57b0" (UID: "d357bfcc-5ce5-456d-b8b7-142ef30a57b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.885297 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.885367 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:49 crc kubenswrapper[4826]: I0319 19:29:49.885383 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9rsr\" (UniqueName: \"kubernetes.io/projected/d357bfcc-5ce5-456d-b8b7-142ef30a57b0-kube-api-access-f9rsr\") on node \"crc\" DevicePath \"\"" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.095499 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" event={"ID":"d357bfcc-5ce5-456d-b8b7-142ef30a57b0","Type":"ContainerDied","Data":"f2db40ff45f79bd7429acb8432541c0b0d042b059106cabaaaec1e1c8d67eb6b"} Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.095547 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2db40ff45f79bd7429acb8432541c0b0d042b059106cabaaaec1e1c8d67eb6b" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.095635 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.194990 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"] Mar 19 19:29:50 crc kubenswrapper[4826]: E0319 19:29:50.195718 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d357bfcc-5ce5-456d-b8b7-142ef30a57b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.195747 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d357bfcc-5ce5-456d-b8b7-142ef30a57b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.196067 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d357bfcc-5ce5-456d-b8b7-142ef30a57b0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.197105 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.199275 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.199440 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.199557 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.199769 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.222240 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"] Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.295881 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktx76\" (UniqueName: \"kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.295963 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.296208 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.399871 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.400255 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx76\" (UniqueName: \"kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.400385 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.404215 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"
Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.407014 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"
Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.422170 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktx76\" (UniqueName: \"kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7frjg\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"
Mar 19 19:29:50 crc kubenswrapper[4826]: I0319 19:29:50.520961 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"
Mar 19 19:29:51 crc kubenswrapper[4826]: I0319 19:29:51.114713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg"]
Mar 19 19:29:52 crc kubenswrapper[4826]: I0319 19:29:52.130405 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" event={"ID":"58cc0c09-7531-463e-ac30-4cb993eca5fc","Type":"ContainerStarted","Data":"22ed8aefa80ae13dc9bc4c69929fa9841acaa6328d4945997a65d68186d87f5f"}
Mar 19 19:29:52 crc kubenswrapper[4826]: I0319 19:29:52.131116 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" event={"ID":"58cc0c09-7531-463e-ac30-4cb993eca5fc","Type":"ContainerStarted","Data":"64b3457b24b1154853a6bacf45ea489ba4f011fb487210000dc18e48688c0a89"}
Mar 19 19:29:52 crc kubenswrapper[4826]: I0319 19:29:52.164525 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" podStartSLOduration=1.681787134 podStartE2EDuration="2.164506009s" podCreationTimestamp="2026-03-19 19:29:50 +0000 UTC" firstStartedPulling="2026-03-19 19:29:51.104917955 +0000 UTC m=+2015.858986268" lastFinishedPulling="2026-03-19 19:29:51.58763682 +0000 UTC m=+2016.341705143" observedRunningTime="2026-03-19 19:29:52.154025704 +0000 UTC m=+2016.908094047" watchObservedRunningTime="2026-03-19 19:29:52.164506009 +0000 UTC m=+2016.918574322"
Mar 19 19:29:55 crc kubenswrapper[4826]: I0319 19:29:55.400193 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 19:29:55 crc kubenswrapper[4826]: I0319 19:29:55.400812 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.164348 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565810-9wxvc"]
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.166673 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.169986 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.170243 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.170678 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.196429 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"]
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.198725 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.201445 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.201605 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.208731 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-9wxvc"]
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.222152 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"]
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.265405 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsdl\" (UniqueName: \"kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl\") pod \"auto-csr-approver-29565810-9wxvc\" (UID: \"bda6fead-94fb-42e0-84f6-f318afdb6969\") " pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.265458 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtxqv\" (UniqueName: \"kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.265690 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.265824 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.368728 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsdl\" (UniqueName: \"kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl\") pod \"auto-csr-approver-29565810-9wxvc\" (UID: \"bda6fead-94fb-42e0-84f6-f318afdb6969\") " pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.368832 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtxqv\" (UniqueName: \"kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.368940 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.369025 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.369867 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.375447 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.389578 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtxqv\" (UniqueName: \"kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv\") pod \"collect-profiles-29565810-w7dq2\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.396297 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsdl\" (UniqueName: \"kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl\") pod \"auto-csr-approver-29565810-9wxvc\" (UID: \"bda6fead-94fb-42e0-84f6-f318afdb6969\") " pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.491185 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:00 crc kubenswrapper[4826]: I0319 19:30:00.517034 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.092720 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"]
Mar 19 19:30:01 crc kubenswrapper[4826]: W0319 19:30:01.162749 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbda6fead_94fb_42e0_84f6_f318afdb6969.slice/crio-bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf WatchSource:0}: Error finding container bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf: Status 404 returned error can't find the container with id bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.162922 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-9wxvc"]
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.245978 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2" event={"ID":"f5cc3244-041a-411c-9459-64b8b57ed1ae","Type":"ContainerStarted","Data":"c88ddb848b7cd7d50ff4db58926bb57963b411e4d5987348cc1a0f12ce88d368"}
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.247280 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-9wxvc" event={"ID":"bda6fead-94fb-42e0-84f6-f318afdb6969","Type":"ContainerStarted","Data":"bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf"}
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.331046 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.333997 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.354465 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.396727 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qt2\" (UniqueName: \"kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.397053 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.397178 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.498868 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.498939 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.499028 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59qt2\" (UniqueName: \"kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.499615 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.499615 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.526347 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qt2\" (UniqueName: \"kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2\") pod \"community-operators-29cf9\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") " pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:01 crc kubenswrapper[4826]: I0319 19:30:01.654452 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:02 crc kubenswrapper[4826]: I0319 19:30:02.275218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2" event={"ID":"f5cc3244-041a-411c-9459-64b8b57ed1ae","Type":"ContainerStarted","Data":"47fc440aa4c3e5bf057fcf82ecb76a2a0de666fa617ada59ab4bdfa77112bb5d"}
Mar 19 19:30:02 crc kubenswrapper[4826]: I0319 19:30:02.341045 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:02 crc kubenswrapper[4826]: I0319 19:30:02.359679 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2" podStartSLOduration=2.35964351 podStartE2EDuration="2.35964351s" podCreationTimestamp="2026-03-19 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:30:02.327514257 +0000 UTC m=+2027.081582570" watchObservedRunningTime="2026-03-19 19:30:02.35964351 +0000 UTC m=+2027.113711823"
Mar 19 19:30:02 crc kubenswrapper[4826]: W0319 19:30:02.377122 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4499b5a1_26c2_4aeb_9f5b_096c09cc25ea.slice/crio-d88158bfdac4e67d8caf63ec4c930590c7060275a27b33337d4c6d8b36721fd7 WatchSource:0}: Error finding container d88158bfdac4e67d8caf63ec4c930590c7060275a27b33337d4c6d8b36721fd7: Status 404 returned error can't find the container with id d88158bfdac4e67d8caf63ec4c930590c7060275a27b33337d4c6d8b36721fd7
Mar 19 19:30:03 crc kubenswrapper[4826]: I0319 19:30:03.288147 4826 generic.go:334] "Generic (PLEG): container finished" podID="f5cc3244-041a-411c-9459-64b8b57ed1ae" containerID="47fc440aa4c3e5bf057fcf82ecb76a2a0de666fa617ada59ab4bdfa77112bb5d" exitCode=0
Mar 19 19:30:03 crc kubenswrapper[4826]: I0319 19:30:03.288196 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2" event={"ID":"f5cc3244-041a-411c-9459-64b8b57ed1ae","Type":"ContainerDied","Data":"47fc440aa4c3e5bf057fcf82ecb76a2a0de666fa617ada59ab4bdfa77112bb5d"}
Mar 19 19:30:03 crc kubenswrapper[4826]: I0319 19:30:03.291788 4826 generic.go:334] "Generic (PLEG): container finished" podID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerID="1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2" exitCode=0
Mar 19 19:30:03 crc kubenswrapper[4826]: I0319 19:30:03.291825 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerDied","Data":"1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2"}
Mar 19 19:30:03 crc kubenswrapper[4826]: I0319 19:30:03.291849 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerStarted","Data":"d88158bfdac4e67d8caf63ec4c930590c7060275a27b33337d4c6d8b36721fd7"}
Mar 19 19:30:04 crc kubenswrapper[4826]: I0319 19:30:04.311256 4826 generic.go:334] "Generic (PLEG): container finished" podID="bda6fead-94fb-42e0-84f6-f318afdb6969" containerID="c96fa13f4c93ffd55256410848bc8474ed30078636283bba2babc2b8c71622ff" exitCode=0
Mar 19 19:30:04 crc kubenswrapper[4826]: I0319 19:30:04.311333 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-9wxvc" event={"ID":"bda6fead-94fb-42e0-84f6-f318afdb6969","Type":"ContainerDied","Data":"c96fa13f4c93ffd55256410848bc8474ed30078636283bba2babc2b8c71622ff"}
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.050183 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nn4pv"]
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.063043 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-nn4pv"]
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.235666 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.317906 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume\") pod \"f5cc3244-041a-411c-9459-64b8b57ed1ae\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") "
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.318085 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtxqv\" (UniqueName: \"kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv\") pod \"f5cc3244-041a-411c-9459-64b8b57ed1ae\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") "
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.318137 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume\") pod \"f5cc3244-041a-411c-9459-64b8b57ed1ae\" (UID: \"f5cc3244-041a-411c-9459-64b8b57ed1ae\") "
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.319593 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5cc3244-041a-411c-9459-64b8b57ed1ae" (UID: "f5cc3244-041a-411c-9459-64b8b57ed1ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.330059 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv" (OuterVolumeSpecName: "kube-api-access-gtxqv") pod "f5cc3244-041a-411c-9459-64b8b57ed1ae" (UID: "f5cc3244-041a-411c-9459-64b8b57ed1ae"). InnerVolumeSpecName "kube-api-access-gtxqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.330935 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2" event={"ID":"f5cc3244-041a-411c-9459-64b8b57ed1ae","Type":"ContainerDied","Data":"c88ddb848b7cd7d50ff4db58926bb57963b411e4d5987348cc1a0f12ce88d368"}
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.330992 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c88ddb848b7cd7d50ff4db58926bb57963b411e4d5987348cc1a0f12ce88d368"
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.331018 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.331041 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5cc3244-041a-411c-9459-64b8b57ed1ae" (UID: "f5cc3244-041a-411c-9459-64b8b57ed1ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.421292 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtxqv\" (UniqueName: \"kubernetes.io/projected/f5cc3244-041a-411c-9459-64b8b57ed1ae-kube-api-access-gtxqv\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.421335 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5cc3244-041a-411c-9459-64b8b57ed1ae-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.421346 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5cc3244-041a-411c-9459-64b8b57ed1ae-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.854377 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.942081 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lsdl\" (UniqueName: \"kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl\") pod \"bda6fead-94fb-42e0-84f6-f318afdb6969\" (UID: \"bda6fead-94fb-42e0-84f6-f318afdb6969\") "
Mar 19 19:30:05 crc kubenswrapper[4826]: I0319 19:30:05.950124 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl" (OuterVolumeSpecName: "kube-api-access-4lsdl") pod "bda6fead-94fb-42e0-84f6-f318afdb6969" (UID: "bda6fead-94fb-42e0-84f6-f318afdb6969"). InnerVolumeSpecName "kube-api-access-4lsdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.004018 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69e3ed0-cf8e-438d-a6f0-dac56664901e" path="/var/lib/kubelet/pods/f69e3ed0-cf8e-438d-a6f0-dac56664901e/volumes"
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.047523 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lsdl\" (UniqueName: \"kubernetes.io/projected/bda6fead-94fb-42e0-84f6-f318afdb6969-kube-api-access-4lsdl\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.314599 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z"]
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.325362 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565765-b929z"]
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.344876 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565810-9wxvc" event={"ID":"bda6fead-94fb-42e0-84f6-f318afdb6969","Type":"ContainerDied","Data":"bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf"}
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.344951 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bddd9cd82948fed870b06f0a6b60bb90680ff5af812fada52a194d5baddaadaf"
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.344914 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565810-9wxvc"
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.347141 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerStarted","Data":"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b"}
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.931785 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-fjhhm"]
Mar 19 19:30:06 crc kubenswrapper[4826]: I0319 19:30:06.944406 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565804-fjhhm"]
Mar 19 19:30:07 crc kubenswrapper[4826]: I0319 19:30:07.988991 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eadd329-3574-4dcc-a93d-163d850caa5a" path="/var/lib/kubelet/pods/0eadd329-3574-4dcc-a93d-163d850caa5a/volumes"
Mar 19 19:30:07 crc kubenswrapper[4826]: I0319 19:30:07.991724 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c53abe4-412d-47a0-bccc-ec9e6f4d8784" path="/var/lib/kubelet/pods/7c53abe4-412d-47a0-bccc-ec9e6f4d8784/volumes"
Mar 19 19:30:08 crc kubenswrapper[4826]: I0319 19:30:08.377459 4826 generic.go:334] "Generic (PLEG): container finished" podID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerID="3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b" exitCode=0
Mar 19 19:30:08 crc kubenswrapper[4826]: I0319 19:30:08.377730 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerDied","Data":"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b"}
Mar 19 19:30:09 crc kubenswrapper[4826]: I0319 19:30:09.392827 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerStarted","Data":"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a"}
Mar 19 19:30:09 crc kubenswrapper[4826]: I0319 19:30:09.419205 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-29cf9" podStartSLOduration=2.7846879810000003 podStartE2EDuration="8.419167587s" podCreationTimestamp="2026-03-19 19:30:01 +0000 UTC" firstStartedPulling="2026-03-19 19:30:03.298940042 +0000 UTC m=+2028.053008355" lastFinishedPulling="2026-03-19 19:30:08.933419648 +0000 UTC m=+2033.687487961" observedRunningTime="2026-03-19 19:30:09.410511515 +0000 UTC m=+2034.164579848" watchObservedRunningTime="2026-03-19 19:30:09.419167587 +0000 UTC m=+2034.173235910"
Mar 19 19:30:11 crc kubenswrapper[4826]: I0319 19:30:11.654685 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:11 crc kubenswrapper[4826]: I0319 19:30:11.654951 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:11 crc kubenswrapper[4826]: I0319 19:30:11.704751 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:21 crc kubenswrapper[4826]: I0319 19:30:21.731531 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:21 crc kubenswrapper[4826]: I0319 19:30:21.794022 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:22 crc kubenswrapper[4826]: I0319 19:30:22.559246 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-29cf9" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="registry-server" containerID="cri-o://8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a" gracePeriod=2
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.189676 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.294240 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59qt2\" (UniqueName: \"kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2\") pod \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") "
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.294408 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content\") pod \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") "
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.294631 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities\") pod \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\" (UID: \"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea\") "
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.296096 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities" (OuterVolumeSpecName: "utilities") pod "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" (UID: "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.301843 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2" (OuterVolumeSpecName: "kube-api-access-59qt2") pod "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" (UID: "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea"). InnerVolumeSpecName "kube-api-access-59qt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.352174 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" (UID: "4499b5a1-26c2-4aeb-9f5b-096c09cc25ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.399167 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59qt2\" (UniqueName: \"kubernetes.io/projected/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-kube-api-access-59qt2\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.399211 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.399221 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.573583 4826 generic.go:334] "Generic (PLEG): container finished" podID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerID="8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a" exitCode=0
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.573643 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerDied","Data":"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a"}
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.573721 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29cf9" event={"ID":"4499b5a1-26c2-4aeb-9f5b-096c09cc25ea","Type":"ContainerDied","Data":"d88158bfdac4e67d8caf63ec4c930590c7060275a27b33337d4c6d8b36721fd7"}
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.573719 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29cf9"
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.573783 4826 scope.go:117] "RemoveContainer" containerID="8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a"
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.621117 4826 scope.go:117] "RemoveContainer" containerID="3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b"
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.631593 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.643618 4826 scope.go:117] "RemoveContainer" containerID="1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2"
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.645974 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-29cf9"]
Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.726701 4826 scope.go:117] "RemoveContainer" containerID="8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a"
Mar 19
19:30:23 crc kubenswrapper[4826]: E0319 19:30:23.727530 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a\": container with ID starting with 8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a not found: ID does not exist" containerID="8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.727604 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a"} err="failed to get container status \"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a\": rpc error: code = NotFound desc = could not find container \"8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a\": container with ID starting with 8666bef44b6fa14974b8d540019ed4488c2b6703fd4ed277759294c5f321694a not found: ID does not exist" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.727631 4826 scope.go:117] "RemoveContainer" containerID="3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b" Mar 19 19:30:23 crc kubenswrapper[4826]: E0319 19:30:23.728230 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b\": container with ID starting with 3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b not found: ID does not exist" containerID="3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.728291 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b"} err="failed to get container status 
\"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b\": rpc error: code = NotFound desc = could not find container \"3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b\": container with ID starting with 3755053ba16b5c65f3f87307f1b2d8491fd33ab0ba37c501b839e6b071ac384b not found: ID does not exist" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.728327 4826 scope.go:117] "RemoveContainer" containerID="1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2" Mar 19 19:30:23 crc kubenswrapper[4826]: E0319 19:30:23.729036 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2\": container with ID starting with 1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2 not found: ID does not exist" containerID="1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.729073 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2"} err="failed to get container status \"1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2\": rpc error: code = NotFound desc = could not find container \"1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2\": container with ID starting with 1a2c085013e59ad6a01a724d00dddf1667084476f2eec15df99143299c1394e2 not found: ID does not exist" Mar 19 19:30:23 crc kubenswrapper[4826]: I0319 19:30:23.993894 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" path="/var/lib/kubelet/pods/4499b5a1-26c2-4aeb-9f5b-096c09cc25ea/volumes" Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.400352 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.400673 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.400727 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.401779 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.401848 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9" gracePeriod=600 Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.602766 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9" exitCode=0 Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.602818 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9"} Mar 19 19:30:25 crc kubenswrapper[4826]: I0319 19:30:25.602864 4826 scope.go:117] "RemoveContainer" containerID="856447f1cdc796c080402d3bfb76d7471741ca95039714006756d0cb980e424c" Mar 19 19:30:26 crc kubenswrapper[4826]: I0319 19:30:26.624359 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"} Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.043538 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-b743-account-create-update-tg8gf"] Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.052475 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-xntrr"] Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.062558 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-b743-account-create-update-tg8gf"] Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.072468 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-xntrr"] Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.692603 4826 generic.go:334] "Generic (PLEG): container finished" podID="58cc0c09-7531-463e-ac30-4cb993eca5fc" containerID="22ed8aefa80ae13dc9bc4c69929fa9841acaa6328d4945997a65d68186d87f5f" exitCode=0 Mar 19 19:30:30 crc kubenswrapper[4826]: I0319 19:30:30.696708 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" event={"ID":"58cc0c09-7531-463e-ac30-4cb993eca5fc","Type":"ContainerDied","Data":"22ed8aefa80ae13dc9bc4c69929fa9841acaa6328d4945997a65d68186d87f5f"} Mar 19 19:30:31 
crc kubenswrapper[4826]: I0319 19:30:31.396033 4826 scope.go:117] "RemoveContainer" containerID="13db328b16b5b91f7dab9647f18f61f5279dc45c5f092cecb97afa038e776674" Mar 19 19:30:31 crc kubenswrapper[4826]: I0319 19:30:31.443055 4826 scope.go:117] "RemoveContainer" containerID="c9b0fe2e7f9ebfff2357c08b0c5c9bdbaa33ce2f44b1e85e2a7eb86f4a52b539" Mar 19 19:30:31 crc kubenswrapper[4826]: I0319 19:30:31.538931 4826 scope.go:117] "RemoveContainer" containerID="1822cdbf54e42ae9c9de3ddaf66eef30a5a35bec9d9f90947b3e4984139e9265" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.003152 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c93b8d4-4e5b-4a8d-a8fb-de909730e675" path="/var/lib/kubelet/pods/6c93b8d4-4e5b-4a8d-a8fb-de909730e675/volumes" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.004593 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79db235-4360-4ffc-a56e-b9fe0fad7eb6" path="/var/lib/kubelet/pods/c79db235-4360-4ffc-a56e-b9fe0fad7eb6/volumes" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.039717 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jsjkk"] Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.053014 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jsjkk"] Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.198123 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.360748 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam\") pod \"58cc0c09-7531-463e-ac30-4cb993eca5fc\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.361099 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory\") pod \"58cc0c09-7531-463e-ac30-4cb993eca5fc\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.361228 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktx76\" (UniqueName: \"kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76\") pod \"58cc0c09-7531-463e-ac30-4cb993eca5fc\" (UID: \"58cc0c09-7531-463e-ac30-4cb993eca5fc\") " Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.365902 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76" (OuterVolumeSpecName: "kube-api-access-ktx76") pod "58cc0c09-7531-463e-ac30-4cb993eca5fc" (UID: "58cc0c09-7531-463e-ac30-4cb993eca5fc"). InnerVolumeSpecName "kube-api-access-ktx76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.400023 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58cc0c09-7531-463e-ac30-4cb993eca5fc" (UID: "58cc0c09-7531-463e-ac30-4cb993eca5fc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.413377 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory" (OuterVolumeSpecName: "inventory") pod "58cc0c09-7531-463e-ac30-4cb993eca5fc" (UID: "58cc0c09-7531-463e-ac30-4cb993eca5fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.464056 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.464090 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktx76\" (UniqueName: \"kubernetes.io/projected/58cc0c09-7531-463e-ac30-4cb993eca5fc-kube-api-access-ktx76\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.464105 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58cc0c09-7531-463e-ac30-4cb993eca5fc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.724426 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" 
event={"ID":"58cc0c09-7531-463e-ac30-4cb993eca5fc","Type":"ContainerDied","Data":"64b3457b24b1154853a6bacf45ea489ba4f011fb487210000dc18e48688c0a89"} Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.724476 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b3457b24b1154853a6bacf45ea489ba4f011fb487210000dc18e48688c0a89" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.724503 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7frjg" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.840816 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr"] Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 19:30:32.841832 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="extract-utilities" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.841865 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="extract-utilities" Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 19:30:32.841895 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cc3244-041a-411c-9459-64b8b57ed1ae" containerName="collect-profiles" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.841908 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cc3244-041a-411c-9459-64b8b57ed1ae" containerName="collect-profiles" Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 19:30:32.841928 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda6fead-94fb-42e0-84f6-f318afdb6969" containerName="oc" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.841940 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda6fead-94fb-42e0-84f6-f318afdb6969" containerName="oc" Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 
19:30:32.841986 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc0c09-7531-463e-ac30-4cb993eca5fc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842001 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc0c09-7531-463e-ac30-4cb993eca5fc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 19:30:32.842021 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="extract-content" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842034 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="extract-content" Mar 19 19:30:32 crc kubenswrapper[4826]: E0319 19:30:32.842055 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="registry-server" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842066 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="registry-server" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842433 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cc3244-041a-411c-9459-64b8b57ed1ae" containerName="collect-profiles" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842464 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4499b5a1-26c2-4aeb-9f5b-096c09cc25ea" containerName="registry-server" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842492 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cc0c09-7531-463e-ac30-4cb993eca5fc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.842520 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bda6fead-94fb-42e0-84f6-f318afdb6969" containerName="oc" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.843988 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.846165 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.846224 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.846398 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.846743 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.852119 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr"] Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.978244 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr24k\" (UniqueName: \"kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.978379 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: 
\"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:32 crc kubenswrapper[4826]: I0319 19:30:32.978464 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.081078 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr24k\" (UniqueName: \"kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.081310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.081456 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 
19:30:33.087719 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.087840 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.110163 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr24k\" (UniqueName: \"kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.160462 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.780705 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr"] Mar 19 19:30:33 crc kubenswrapper[4826]: I0319 19:30:33.991212 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dd165a-9504-4521-9814-4e252234fd9b" path="/var/lib/kubelet/pods/05dd165a-9504-4521-9814-4e252234fd9b/volumes" Mar 19 19:30:34 crc kubenswrapper[4826]: I0319 19:30:34.756394 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" event={"ID":"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d","Type":"ContainerStarted","Data":"5c68949f88602982418f75f42dba066f45334a72f99fac429ec8e3d2fd725c38"} Mar 19 19:30:34 crc kubenswrapper[4826]: I0319 19:30:34.756778 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" event={"ID":"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d","Type":"ContainerStarted","Data":"a26fe3d7b6d8f8b468082e399a282888f8214078a66926ba2dfd4ee16a857a02"} Mar 19 19:30:34 crc kubenswrapper[4826]: I0319 19:30:34.783430 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" podStartSLOduration=2.291413141 podStartE2EDuration="2.783398112s" podCreationTimestamp="2026-03-19 19:30:32 +0000 UTC" firstStartedPulling="2026-03-19 19:30:33.786482844 +0000 UTC m=+2058.540551167" lastFinishedPulling="2026-03-19 19:30:34.278467815 +0000 UTC m=+2059.032536138" observedRunningTime="2026-03-19 19:30:34.776515874 +0000 UTC m=+2059.530584187" watchObservedRunningTime="2026-03-19 19:30:34.783398112 +0000 UTC m=+2059.537466455" Mar 19 19:30:35 crc kubenswrapper[4826]: I0319 19:30:35.035576 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-mdwdm"] Mar 19 19:30:35 crc kubenswrapper[4826]: I0319 19:30:35.048945 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mdwdm"] Mar 19 19:30:36 crc kubenswrapper[4826]: I0319 19:30:36.003331 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15200e4f-15d2-450d-93ff-3b26e3df0b48" path="/var/lib/kubelet/pods/15200e4f-15d2-450d-93ff-3b26e3df0b48/volumes" Mar 19 19:31:20 crc kubenswrapper[4826]: I0319 19:31:20.071380 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4bbb2"] Mar 19 19:31:20 crc kubenswrapper[4826]: I0319 19:31:20.086162 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4bbb2"] Mar 19 19:31:21 crc kubenswrapper[4826]: I0319 19:31:21.997824 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c464b5-25b1-4543-8067-01ca9883d215" path="/var/lib/kubelet/pods/a8c464b5-25b1-4543-8067-01ca9883d215/volumes" Mar 19 19:31:27 crc kubenswrapper[4826]: I0319 19:31:27.395183 4826 generic.go:334] "Generic (PLEG): container finished" podID="a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" containerID="5c68949f88602982418f75f42dba066f45334a72f99fac429ec8e3d2fd725c38" exitCode=0 Mar 19 19:31:27 crc kubenswrapper[4826]: I0319 19:31:27.395287 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" event={"ID":"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d","Type":"ContainerDied","Data":"5c68949f88602982418f75f42dba066f45334a72f99fac429ec8e3d2fd725c38"} Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.125116 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.174254 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr24k\" (UniqueName: \"kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k\") pod \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.174487 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam\") pod \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.174651 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory\") pod \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\" (UID: \"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d\") " Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.181978 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k" (OuterVolumeSpecName: "kube-api-access-lr24k") pod "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" (UID: "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d"). InnerVolumeSpecName "kube-api-access-lr24k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.208594 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory" (OuterVolumeSpecName: "inventory") pod "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" (UID: "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.233745 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" (UID: "a450a030-0a6c-4b58-8d12-ac4b7fb8c20d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.277614 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.277672 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr24k\" (UniqueName: \"kubernetes.io/projected/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-kube-api-access-lr24k\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.277686 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a450a030-0a6c-4b58-8d12-ac4b7fb8c20d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.420205 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" event={"ID":"a450a030-0a6c-4b58-8d12-ac4b7fb8c20d","Type":"ContainerDied","Data":"a26fe3d7b6d8f8b468082e399a282888f8214078a66926ba2dfd4ee16a857a02"} Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.420611 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26fe3d7b6d8f8b468082e399a282888f8214078a66926ba2dfd4ee16a857a02" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 
19:31:29.420429 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.533510 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-njzdg"] Mar 19 19:31:29 crc kubenswrapper[4826]: E0319 19:31:29.534388 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.534423 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.534873 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a450a030-0a6c-4b58-8d12-ac4b7fb8c20d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.536418 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.540648 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.540725 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.540725 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.541416 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.544267 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-njzdg"] Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.584623 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzhl\" (UniqueName: \"kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.584921 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.585003 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.686896 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.686993 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzhl\" (UniqueName: \"kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.687200 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.692374 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc 
kubenswrapper[4826]: I0319 19:31:29.693282 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.706816 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzhl\" (UniqueName: \"kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl\") pod \"ssh-known-hosts-edpm-deployment-njzdg\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:29 crc kubenswrapper[4826]: I0319 19:31:29.856743 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:30 crc kubenswrapper[4826]: I0319 19:31:30.553242 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-njzdg"] Mar 19 19:31:30 crc kubenswrapper[4826]: W0319 19:31:30.556481 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda28972ae_f55e_4db8_9c20_28befd60e934.slice/crio-56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963 WatchSource:0}: Error finding container 56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963: Status 404 returned error can't find the container with id 56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963 Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.446821 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" 
event={"ID":"a28972ae-f55e-4db8-9c20-28befd60e934","Type":"ContainerStarted","Data":"56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963"} Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.748376 4826 scope.go:117] "RemoveContainer" containerID="06c8539cd9a6da203de24f0f362302644bbdfa63a2e96e302fee3836bf4a7a20" Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.802766 4826 scope.go:117] "RemoveContainer" containerID="09baaea1ee7746a33fabaa2cd9a14a050dad1d6223ce71cfd63f0bb9d1e46a1b" Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.848770 4826 scope.go:117] "RemoveContainer" containerID="ec3eb7c4919e7e32538e479de44132a4e46e4a61c8b886700cefcb8d689aab77" Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.888006 4826 scope.go:117] "RemoveContainer" containerID="233819d697b61141ef384fc05a50210d065f07c7f69db5d65d6425edd348850e" Mar 19 19:31:31 crc kubenswrapper[4826]: I0319 19:31:31.913637 4826 scope.go:117] "RemoveContainer" containerID="95feeb7d4dd3a671e8330b63a53f2465bf33a41ef703a61445cf7d0fb1715f3d" Mar 19 19:31:32 crc kubenswrapper[4826]: I0319 19:31:32.463557 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" event={"ID":"a28972ae-f55e-4db8-9c20-28befd60e934","Type":"ContainerStarted","Data":"88825481cda392308a1d894b291f868a24e59418ff4c392ef6420f5db6745ede"} Mar 19 19:31:32 crc kubenswrapper[4826]: I0319 19:31:32.509237 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" podStartSLOduration=2.515512206 podStartE2EDuration="3.509169134s" podCreationTimestamp="2026-03-19 19:31:29 +0000 UTC" firstStartedPulling="2026-03-19 19:31:30.560767217 +0000 UTC m=+2115.314835540" lastFinishedPulling="2026-03-19 19:31:31.554424155 +0000 UTC m=+2116.308492468" observedRunningTime="2026-03-19 19:31:32.496738661 +0000 UTC m=+2117.250807004" watchObservedRunningTime="2026-03-19 19:31:32.509169134 +0000 UTC 
m=+2117.263237457" Mar 19 19:31:39 crc kubenswrapper[4826]: I0319 19:31:39.561823 4826 generic.go:334] "Generic (PLEG): container finished" podID="a28972ae-f55e-4db8-9c20-28befd60e934" containerID="88825481cda392308a1d894b291f868a24e59418ff4c392ef6420f5db6745ede" exitCode=0 Mar 19 19:31:39 crc kubenswrapper[4826]: I0319 19:31:39.561940 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" event={"ID":"a28972ae-f55e-4db8-9c20-28befd60e934","Type":"ContainerDied","Data":"88825481cda392308a1d894b291f868a24e59418ff4c392ef6420f5db6745ede"} Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.150356 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.240474 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam\") pod \"a28972ae-f55e-4db8-9c20-28befd60e934\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.240696 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0\") pod \"a28972ae-f55e-4db8-9c20-28befd60e934\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.240990 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxzhl\" (UniqueName: \"kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl\") pod \"a28972ae-f55e-4db8-9c20-28befd60e934\" (UID: \"a28972ae-f55e-4db8-9c20-28befd60e934\") " Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.248499 4826 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl" (OuterVolumeSpecName: "kube-api-access-rxzhl") pod "a28972ae-f55e-4db8-9c20-28befd60e934" (UID: "a28972ae-f55e-4db8-9c20-28befd60e934"). InnerVolumeSpecName "kube-api-access-rxzhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.295596 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a28972ae-f55e-4db8-9c20-28befd60e934" (UID: "a28972ae-f55e-4db8-9c20-28befd60e934"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.313844 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a28972ae-f55e-4db8-9c20-28befd60e934" (UID: "a28972ae-f55e-4db8-9c20-28befd60e934"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.343988 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxzhl\" (UniqueName: \"kubernetes.io/projected/a28972ae-f55e-4db8-9c20-28befd60e934-kube-api-access-rxzhl\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.344028 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.344043 4826 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a28972ae-f55e-4db8-9c20-28befd60e934-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.610320 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" event={"ID":"a28972ae-f55e-4db8-9c20-28befd60e934","Type":"ContainerDied","Data":"56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963"} Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.610802 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56520b7af10120c70a1b3b5d92abe96afdc319c813780b72821f518c89871963" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.610608 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-njzdg" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.686080 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln"] Mar 19 19:31:41 crc kubenswrapper[4826]: E0319 19:31:41.686763 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28972ae-f55e-4db8-9c20-28befd60e934" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.686787 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28972ae-f55e-4db8-9c20-28befd60e934" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.687113 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28972ae-f55e-4db8-9c20-28befd60e934" containerName="ssh-known-hosts-edpm-deployment" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.688200 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.698480 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln"] Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.719995 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.720115 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.720004 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.720366 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.757767 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8s5m\" (UniqueName: \"kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.757886 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.758121 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.860572 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.860774 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.860871 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8s5m\" (UniqueName: \"kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.867351 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.868067 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:41 crc kubenswrapper[4826]: I0319 19:31:41.877453 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8s5m\" (UniqueName: \"kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jb5ln\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:42 crc kubenswrapper[4826]: I0319 19:31:42.051600 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:42 crc kubenswrapper[4826]: I0319 19:31:42.669939 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln"] Mar 19 19:31:43 crc kubenswrapper[4826]: I0319 19:31:43.643976 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" event={"ID":"0778f133-8f61-4ccc-bde7-a664c3ff638b","Type":"ContainerStarted","Data":"1f09a6de8922aa8af5532fb9d21bdfce699a57eca759d646246b9015d6be84ae"} Mar 19 19:31:43 crc kubenswrapper[4826]: I0319 19:31:43.644319 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" event={"ID":"0778f133-8f61-4ccc-bde7-a664c3ff638b","Type":"ContainerStarted","Data":"c8635dd4359854a27b6606f167702aedc3b9f5f8f7121e2a964fbd40978f0a93"} Mar 19 19:31:43 crc kubenswrapper[4826]: I0319 19:31:43.668315 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" podStartSLOduration=2.19733156 podStartE2EDuration="2.668294628s" podCreationTimestamp="2026-03-19 19:31:41 +0000 UTC" firstStartedPulling="2026-03-19 19:31:42.665972639 +0000 UTC m=+2127.420040952" lastFinishedPulling="2026-03-19 19:31:43.136935697 +0000 UTC m=+2127.891004020" observedRunningTime="2026-03-19 19:31:43.658670043 +0000 UTC m=+2128.412738366" watchObservedRunningTime="2026-03-19 19:31:43.668294628 +0000 UTC m=+2128.422362941" Mar 19 19:31:51 crc kubenswrapper[4826]: I0319 19:31:51.752171 4826 generic.go:334] "Generic (PLEG): container finished" podID="0778f133-8f61-4ccc-bde7-a664c3ff638b" containerID="1f09a6de8922aa8af5532fb9d21bdfce699a57eca759d646246b9015d6be84ae" exitCode=0 Mar 19 19:31:51 crc kubenswrapper[4826]: I0319 19:31:51.752294 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" event={"ID":"0778f133-8f61-4ccc-bde7-a664c3ff638b","Type":"ContainerDied","Data":"1f09a6de8922aa8af5532fb9d21bdfce699a57eca759d646246b9015d6be84ae"} Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.435749 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.633740 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam\") pod \"0778f133-8f61-4ccc-bde7-a664c3ff638b\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.634302 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory\") pod \"0778f133-8f61-4ccc-bde7-a664c3ff638b\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.634346 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8s5m\" (UniqueName: \"kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m\") pod \"0778f133-8f61-4ccc-bde7-a664c3ff638b\" (UID: \"0778f133-8f61-4ccc-bde7-a664c3ff638b\") " Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.640341 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m" (OuterVolumeSpecName: "kube-api-access-z8s5m") pod "0778f133-8f61-4ccc-bde7-a664c3ff638b" (UID: "0778f133-8f61-4ccc-bde7-a664c3ff638b"). InnerVolumeSpecName "kube-api-access-z8s5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.674601 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory" (OuterVolumeSpecName: "inventory") pod "0778f133-8f61-4ccc-bde7-a664c3ff638b" (UID: "0778f133-8f61-4ccc-bde7-a664c3ff638b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.692445 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0778f133-8f61-4ccc-bde7-a664c3ff638b" (UID: "0778f133-8f61-4ccc-bde7-a664c3ff638b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.738090 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.738167 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8s5m\" (UniqueName: \"kubernetes.io/projected/0778f133-8f61-4ccc-bde7-a664c3ff638b-kube-api-access-z8s5m\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.738198 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0778f133-8f61-4ccc-bde7-a664c3ff638b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.779623 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" 
event={"ID":"0778f133-8f61-4ccc-bde7-a664c3ff638b","Type":"ContainerDied","Data":"c8635dd4359854a27b6606f167702aedc3b9f5f8f7121e2a964fbd40978f0a93"} Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.779703 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8635dd4359854a27b6606f167702aedc3b9f5f8f7121e2a964fbd40978f0a93" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.779747 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jb5ln" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.863207 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp"] Mar 19 19:31:53 crc kubenswrapper[4826]: E0319 19:31:53.863667 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0778f133-8f61-4ccc-bde7-a664c3ff638b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.863684 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0778f133-8f61-4ccc-bde7-a664c3ff638b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.864828 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0778f133-8f61-4ccc-bde7-a664c3ff638b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.866519 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.870444 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.870446 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.872596 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.873802 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:31:53 crc kubenswrapper[4826]: I0319 19:31:53.880702 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp"] Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.044827 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sj6z\" (UniqueName: \"kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.045495 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.046193 4826 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.147836 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sj6z\" (UniqueName: \"kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.148028 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.148147 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.153012 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.167164 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.170069 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sj6z\" (UniqueName: \"kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.191455 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.740342 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp"] Mar 19 19:31:54 crc kubenswrapper[4826]: I0319 19:31:54.790602 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" event={"ID":"23035475-3ff6-49e0-9810-c27013a74f8c","Type":"ContainerStarted","Data":"ab8c3474fe1de4bb536a268b7edf0bb8773fe70107d9bea77ed0342009b4853b"} Mar 19 19:31:55 crc kubenswrapper[4826]: I0319 19:31:55.803032 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" event={"ID":"23035475-3ff6-49e0-9810-c27013a74f8c","Type":"ContainerStarted","Data":"29efbd620435c0ef291c36c625969d96a53398cb642d5e40ad49b073297e05b7"} Mar 19 19:31:55 crc kubenswrapper[4826]: I0319 19:31:55.831808 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" podStartSLOduration=2.334295994 podStartE2EDuration="2.83178728s" podCreationTimestamp="2026-03-19 19:31:53 +0000 UTC" firstStartedPulling="2026-03-19 19:31:54.73755287 +0000 UTC m=+2139.491621183" lastFinishedPulling="2026-03-19 19:31:55.235044116 +0000 UTC m=+2139.989112469" observedRunningTime="2026-03-19 19:31:55.818491716 +0000 UTC m=+2140.572560069" watchObservedRunningTime="2026-03-19 19:31:55.83178728 +0000 UTC m=+2140.585855613" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.174128 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565812-l2b98"] Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.179763 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.183074 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.183277 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.186297 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-l2b98"] Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.186401 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.226301 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wchh\" (UniqueName: \"kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh\") pod \"auto-csr-approver-29565812-l2b98\" (UID: \"69938b68-3fa5-4ed5-acd8-ca538ae21efb\") " pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.328946 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wchh\" (UniqueName: \"kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh\") pod \"auto-csr-approver-29565812-l2b98\" (UID: \"69938b68-3fa5-4ed5-acd8-ca538ae21efb\") " pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.350687 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wchh\" (UniqueName: \"kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh\") pod \"auto-csr-approver-29565812-l2b98\" (UID: \"69938b68-3fa5-4ed5-acd8-ca538ae21efb\") " 
pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:00 crc kubenswrapper[4826]: I0319 19:32:00.510611 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:01 crc kubenswrapper[4826]: I0319 19:32:01.096335 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-l2b98"] Mar 19 19:32:01 crc kubenswrapper[4826]: W0319 19:32:01.097644 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69938b68_3fa5_4ed5_acd8_ca538ae21efb.slice/crio-0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf WatchSource:0}: Error finding container 0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf: Status 404 returned error can't find the container with id 0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf Mar 19 19:32:01 crc kubenswrapper[4826]: I0319 19:32:01.892457 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-l2b98" event={"ID":"69938b68-3fa5-4ed5-acd8-ca538ae21efb","Type":"ContainerStarted","Data":"0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf"} Mar 19 19:32:02 crc kubenswrapper[4826]: I0319 19:32:02.904590 4826 generic.go:334] "Generic (PLEG): container finished" podID="69938b68-3fa5-4ed5-acd8-ca538ae21efb" containerID="b76b0b59c8ee17e1a0e07823ccb1a646ba9e016245e00ac1feb580f549936e37" exitCode=0 Mar 19 19:32:02 crc kubenswrapper[4826]: I0319 19:32:02.904687 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-l2b98" event={"ID":"69938b68-3fa5-4ed5-acd8-ca538ae21efb","Type":"ContainerDied","Data":"b76b0b59c8ee17e1a0e07823ccb1a646ba9e016245e00ac1feb580f549936e37"} Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.444149 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.565392 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wchh\" (UniqueName: \"kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh\") pod \"69938b68-3fa5-4ed5-acd8-ca538ae21efb\" (UID: \"69938b68-3fa5-4ed5-acd8-ca538ae21efb\") " Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.749464 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh" (OuterVolumeSpecName: "kube-api-access-4wchh") pod "69938b68-3fa5-4ed5-acd8-ca538ae21efb" (UID: "69938b68-3fa5-4ed5-acd8-ca538ae21efb"). InnerVolumeSpecName "kube-api-access-4wchh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.770269 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wchh\" (UniqueName: \"kubernetes.io/projected/69938b68-3fa5-4ed5-acd8-ca538ae21efb-kube-api-access-4wchh\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.931465 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565812-l2b98" event={"ID":"69938b68-3fa5-4ed5-acd8-ca538ae21efb","Type":"ContainerDied","Data":"0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf"} Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.931796 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecba1d1ad3d445067fc72c0422e9db2ec5ea1b74b34af17ab8a999f98d533bf" Mar 19 19:32:04 crc kubenswrapper[4826]: I0319 19:32:04.931599 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565812-l2b98" Mar 19 19:32:05 crc kubenswrapper[4826]: I0319 19:32:05.534141 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-qjm7v"] Mar 19 19:32:05 crc kubenswrapper[4826]: I0319 19:32:05.552689 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565806-qjm7v"] Mar 19 19:32:05 crc kubenswrapper[4826]: I0319 19:32:05.953433 4826 generic.go:334] "Generic (PLEG): container finished" podID="23035475-3ff6-49e0-9810-c27013a74f8c" containerID="29efbd620435c0ef291c36c625969d96a53398cb642d5e40ad49b073297e05b7" exitCode=0 Mar 19 19:32:05 crc kubenswrapper[4826]: I0319 19:32:05.953504 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" event={"ID":"23035475-3ff6-49e0-9810-c27013a74f8c","Type":"ContainerDied","Data":"29efbd620435c0ef291c36c625969d96a53398cb642d5e40ad49b073297e05b7"} Mar 19 19:32:06 crc kubenswrapper[4826]: I0319 19:32:06.002185 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab15c7e-f20f-44a8-aa1f-4156be2adec9" path="/var/lib/kubelet/pods/5ab15c7e-f20f-44a8-aa1f-4156be2adec9/volumes" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.514985 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.655435 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sj6z\" (UniqueName: \"kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z\") pod \"23035475-3ff6-49e0-9810-c27013a74f8c\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.655577 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam\") pod \"23035475-3ff6-49e0-9810-c27013a74f8c\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.655614 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory\") pod \"23035475-3ff6-49e0-9810-c27013a74f8c\" (UID: \"23035475-3ff6-49e0-9810-c27013a74f8c\") " Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.676091 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z" (OuterVolumeSpecName: "kube-api-access-5sj6z") pod "23035475-3ff6-49e0-9810-c27013a74f8c" (UID: "23035475-3ff6-49e0-9810-c27013a74f8c"). InnerVolumeSpecName "kube-api-access-5sj6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.690227 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23035475-3ff6-49e0-9810-c27013a74f8c" (UID: "23035475-3ff6-49e0-9810-c27013a74f8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.703848 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory" (OuterVolumeSpecName: "inventory") pod "23035475-3ff6-49e0-9810-c27013a74f8c" (UID: "23035475-3ff6-49e0-9810-c27013a74f8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.759382 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sj6z\" (UniqueName: \"kubernetes.io/projected/23035475-3ff6-49e0-9810-c27013a74f8c-kube-api-access-5sj6z\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.759417 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.759432 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23035475-3ff6-49e0-9810-c27013a74f8c-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.991370 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.992999 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp" event={"ID":"23035475-3ff6-49e0-9810-c27013a74f8c","Type":"ContainerDied","Data":"ab8c3474fe1de4bb536a268b7edf0bb8773fe70107d9bea77ed0342009b4853b"} Mar 19 19:32:07 crc kubenswrapper[4826]: I0319 19:32:07.993124 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab8c3474fe1de4bb536a268b7edf0bb8773fe70107d9bea77ed0342009b4853b" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.129608 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw"] Mar 19 19:32:08 crc kubenswrapper[4826]: E0319 19:32:08.130636 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23035475-3ff6-49e0-9810-c27013a74f8c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.130678 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="23035475-3ff6-49e0-9810-c27013a74f8c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:08 crc kubenswrapper[4826]: E0319 19:32:08.130737 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69938b68-3fa5-4ed5-acd8-ca538ae21efb" containerName="oc" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.130747 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="69938b68-3fa5-4ed5-acd8-ca538ae21efb" containerName="oc" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.131165 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="69938b68-3fa5-4ed5-acd8-ca538ae21efb" containerName="oc" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.131192 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="23035475-3ff6-49e0-9810-c27013a74f8c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.132210 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.134399 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.136803 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137034 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137324 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137633 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137801 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137832 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.137634 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.138038 4826 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.178083 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw"] Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.272032 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.274570 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.274630 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.274694 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.274833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.274987 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ll5\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275125 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275360 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275389 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275479 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275586 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275725 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275887 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.275980 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.276137 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.276238 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.379644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.379733 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.379845 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.379889 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.379930 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380498 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380548 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59ll5\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380618 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380707 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380724 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380777 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380825 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380895 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380967 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.380991 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.381026 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.385480 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.389316 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.390825 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.390927 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.391881 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.393076 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.393897 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.395572 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 
19:32:08.397859 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.398700 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.400166 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.400376 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.400419 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.400946 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.401609 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.404165 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ll5\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-glfmw\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:08 crc kubenswrapper[4826]: I0319 19:32:08.464815 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:09 crc kubenswrapper[4826]: I0319 19:32:09.091383 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw"] Mar 19 19:32:10 crc kubenswrapper[4826]: I0319 19:32:10.020580 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" event={"ID":"e534979a-19b2-434a-9b5e-d7d7f4d9125b","Type":"ContainerStarted","Data":"306cb979a86190e83f4cf5415dad307250cc008f2573c486c534da3aa2ed80d3"} Mar 19 19:32:10 crc kubenswrapper[4826]: I0319 19:32:10.020919 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" event={"ID":"e534979a-19b2-434a-9b5e-d7d7f4d9125b","Type":"ContainerStarted","Data":"34c75cac691547c7dbd0ef3a4cc5666dc42d0525863a7622c294c0461f2fb44a"} Mar 19 19:32:10 crc kubenswrapper[4826]: I0319 19:32:10.058603 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" podStartSLOduration=1.518349194 podStartE2EDuration="2.058581101s" podCreationTimestamp="2026-03-19 19:32:08 +0000 UTC" firstStartedPulling="2026-03-19 19:32:09.093576771 +0000 UTC m=+2153.847645084" lastFinishedPulling="2026-03-19 19:32:09.633808658 +0000 UTC m=+2154.387876991" observedRunningTime="2026-03-19 19:32:10.043925864 +0000 UTC m=+2154.797994217" watchObservedRunningTime="2026-03-19 19:32:10.058581101 +0000 UTC m=+2154.812649424" Mar 19 19:32:25 crc kubenswrapper[4826]: I0319 19:32:25.401232 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:32:25 crc 
kubenswrapper[4826]: I0319 19:32:25.402102 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.690607 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.697324 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.736018 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.856716 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz485\" (UniqueName: \"kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.856793 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.857264 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.959185 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.959589 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz485\" (UniqueName: \"kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.959893 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.959643 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.960170 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:27 crc kubenswrapper[4826]: I0319 19:32:27.982626 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz485\" (UniqueName: \"kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485\") pod \"redhat-marketplace-h5fnh\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:28 crc kubenswrapper[4826]: I0319 19:32:28.021472 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:28 crc kubenswrapper[4826]: I0319 19:32:28.494564 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:29 crc kubenswrapper[4826]: I0319 19:32:29.265772 4826 generic.go:334] "Generic (PLEG): container finished" podID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerID="e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5" exitCode=0 Mar 19 19:32:29 crc kubenswrapper[4826]: I0319 19:32:29.266227 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerDied","Data":"e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5"} Mar 19 19:32:29 crc kubenswrapper[4826]: I0319 19:32:29.266310 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerStarted","Data":"1ea09bc16515bf244161d363ff34b8d10b422c47da823884c1d2d4587c1c890d"} Mar 19 19:32:30 crc kubenswrapper[4826]: I0319 19:32:30.282070 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerStarted","Data":"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02"} Mar 19 19:32:32 crc kubenswrapper[4826]: I0319 19:32:32.097775 4826 scope.go:117] "RemoveContainer" containerID="b3f2a66dec58233f73318862b34c57797c4ba7d6fc294398f3c9e5eabaa806c1" Mar 19 19:32:32 crc kubenswrapper[4826]: I0319 19:32:32.308688 4826 generic.go:334] "Generic (PLEG): container finished" podID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerID="d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02" exitCode=0 Mar 19 19:32:32 crc kubenswrapper[4826]: I0319 19:32:32.308743 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerDied","Data":"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02"} Mar 19 19:32:33 crc kubenswrapper[4826]: I0319 19:32:33.321076 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerStarted","Data":"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95"} Mar 19 19:32:33 crc kubenswrapper[4826]: I0319 19:32:33.355168 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5fnh" podStartSLOduration=2.857940567 podStartE2EDuration="6.355147672s" podCreationTimestamp="2026-03-19 19:32:27 +0000 UTC" firstStartedPulling="2026-03-19 19:32:29.270243533 +0000 UTC m=+2174.024311896" lastFinishedPulling="2026-03-19 19:32:32.767450668 +0000 UTC m=+2177.521519001" observedRunningTime="2026-03-19 19:32:33.339548772 +0000 UTC m=+2178.093617095" watchObservedRunningTime="2026-03-19 19:32:33.355147672 +0000 UTC m=+2178.109216005" Mar 19 19:32:38 crc kubenswrapper[4826]: I0319 19:32:38.022388 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:38 crc kubenswrapper[4826]: I0319 19:32:38.023264 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:38 crc kubenswrapper[4826]: I0319 19:32:38.117889 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:38 crc kubenswrapper[4826]: I0319 19:32:38.460845 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:38 crc kubenswrapper[4826]: I0319 19:32:38.512574 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.409879 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5fnh" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="registry-server" containerID="cri-o://68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95" gracePeriod=2 Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.897975 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.958489 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities\") pod \"f04d4a78-800e-4641-87ed-8268ad4fea33\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.959018 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content\") pod \"f04d4a78-800e-4641-87ed-8268ad4fea33\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.959148 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz485\" (UniqueName: \"kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485\") pod \"f04d4a78-800e-4641-87ed-8268ad4fea33\" (UID: \"f04d4a78-800e-4641-87ed-8268ad4fea33\") " Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.959530 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities" (OuterVolumeSpecName: "utilities") pod "f04d4a78-800e-4641-87ed-8268ad4fea33" (UID: "f04d4a78-800e-4641-87ed-8268ad4fea33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.960417 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.968106 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485" (OuterVolumeSpecName: "kube-api-access-pz485") pod "f04d4a78-800e-4641-87ed-8268ad4fea33" (UID: "f04d4a78-800e-4641-87ed-8268ad4fea33"). InnerVolumeSpecName "kube-api-access-pz485". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:40 crc kubenswrapper[4826]: I0319 19:32:40.998126 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f04d4a78-800e-4641-87ed-8268ad4fea33" (UID: "f04d4a78-800e-4641-87ed-8268ad4fea33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.074809 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f04d4a78-800e-4641-87ed-8268ad4fea33-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.074870 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz485\" (UniqueName: \"kubernetes.io/projected/f04d4a78-800e-4641-87ed-8268ad4fea33-kube-api-access-pz485\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.423705 4826 generic.go:334] "Generic (PLEG): container finished" podID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerID="68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95" exitCode=0 Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.423765 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerDied","Data":"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95"} Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.423796 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5fnh" event={"ID":"f04d4a78-800e-4641-87ed-8268ad4fea33","Type":"ContainerDied","Data":"1ea09bc16515bf244161d363ff34b8d10b422c47da823884c1d2d4587c1c890d"} Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.423817 4826 scope.go:117] "RemoveContainer" containerID="68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.423826 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5fnh" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.488484 4826 scope.go:117] "RemoveContainer" containerID="d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.489719 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.500696 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5fnh"] Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.532932 4826 scope.go:117] "RemoveContainer" containerID="e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.577455 4826 scope.go:117] "RemoveContainer" containerID="68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95" Mar 19 19:32:41 crc kubenswrapper[4826]: E0319 19:32:41.578062 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95\": container with ID starting with 68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95 not found: ID does not exist" containerID="68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.578126 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95"} err="failed to get container status \"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95\": rpc error: code = NotFound desc = could not find container \"68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95\": container with ID starting with 68b9ad0bb66d03e89feceeb7325b0405be44bab2b1b49d1832e5f5837f211a95 not found: 
ID does not exist" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.578162 4826 scope.go:117] "RemoveContainer" containerID="d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02" Mar 19 19:32:41 crc kubenswrapper[4826]: E0319 19:32:41.578622 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02\": container with ID starting with d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02 not found: ID does not exist" containerID="d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.578672 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02"} err="failed to get container status \"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02\": rpc error: code = NotFound desc = could not find container \"d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02\": container with ID starting with d981cfe103ad7de3fc786c40bda07f51424e94b8c29e0c8d36849ad9daa93a02 not found: ID does not exist" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.578698 4826 scope.go:117] "RemoveContainer" containerID="e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5" Mar 19 19:32:41 crc kubenswrapper[4826]: E0319 19:32:41.579006 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5\": container with ID starting with e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5 not found: ID does not exist" containerID="e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.579029 4826 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5"} err="failed to get container status \"e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5\": rpc error: code = NotFound desc = could not find container \"e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5\": container with ID starting with e2579c97c5aa667c6f65a1891225fef94a8cbfc5b0b694ca614cb541c5dc4fa5 not found: ID does not exist" Mar 19 19:32:41 crc kubenswrapper[4826]: I0319 19:32:41.992436 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" path="/var/lib/kubelet/pods/f04d4a78-800e-4641-87ed-8268ad4fea33/volumes" Mar 19 19:32:46 crc kubenswrapper[4826]: I0319 19:32:46.046812 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-pvzvx"] Mar 19 19:32:46 crc kubenswrapper[4826]: I0319 19:32:46.059767 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-pvzvx"] Mar 19 19:32:47 crc kubenswrapper[4826]: I0319 19:32:47.989891 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3204f1-b777-49ae-8ba5-2f30f639dd1e" path="/var/lib/kubelet/pods/be3204f1-b777-49ae-8ba5-2f30f639dd1e/volumes" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.398506 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:32:49 crc kubenswrapper[4826]: E0319 19:32:49.402008 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="extract-utilities" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.402052 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="extract-utilities" Mar 19 19:32:49 crc kubenswrapper[4826]: E0319 19:32:49.402697 4826 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="registry-server" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.402721 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="registry-server" Mar 19 19:32:49 crc kubenswrapper[4826]: E0319 19:32:49.402764 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="extract-content" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.402774 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="extract-content" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.405540 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04d4a78-800e-4641-87ed-8268ad4fea33" containerName="registry-server" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.413114 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.463714 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.535277 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.535324 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbk4\" (UniqueName: \"kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4\") pod \"certified-operators-d5cbh\" (UID: 
\"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.535571 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.637596 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.637726 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.637745 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbk4\" (UniqueName: \"kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.638343 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content\") pod \"certified-operators-d5cbh\" (UID: 
\"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.638374 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.658854 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbk4\" (UniqueName: \"kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4\") pod \"certified-operators-d5cbh\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:49 crc kubenswrapper[4826]: I0319 19:32:49.773058 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:50 crc kubenswrapper[4826]: I0319 19:32:50.295072 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:32:50 crc kubenswrapper[4826]: I0319 19:32:50.541517 4826 generic.go:334] "Generic (PLEG): container finished" podID="913489f6-b821-469f-bece-10ed5f48c768" containerID="55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa" exitCode=0 Mar 19 19:32:50 crc kubenswrapper[4826]: I0319 19:32:50.541564 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerDied","Data":"55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa"} Mar 19 19:32:50 crc kubenswrapper[4826]: I0319 19:32:50.541592 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" 
event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerStarted","Data":"6b7a48bb1b719b5abad299823995decb5597ccaf4d74fa71b6f27312c6fed1d0"} Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.369204 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.372177 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.393797 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.514552 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.514791 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.515111 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldnqv\" (UniqueName: \"kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.562137 4826 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerStarted","Data":"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6"} Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.617503 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldnqv\" (UniqueName: \"kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.617608 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.617812 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.618429 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.618489 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.641134 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldnqv\" (UniqueName: \"kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv\") pod \"redhat-operators-2r79f\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:52 crc kubenswrapper[4826]: I0319 19:32:52.697430 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:32:53 crc kubenswrapper[4826]: I0319 19:32:53.041467 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:32:53 crc kubenswrapper[4826]: I0319 19:32:53.575807 4826 generic.go:334] "Generic (PLEG): container finished" podID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerID="73be179c0eab37f3888bb0ea6a88e78997d65af038d76af4fdcff0db9dd46449" exitCode=0 Mar 19 19:32:53 crc kubenswrapper[4826]: I0319 19:32:53.575867 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerDied","Data":"73be179c0eab37f3888bb0ea6a88e78997d65af038d76af4fdcff0db9dd46449"} Mar 19 19:32:53 crc kubenswrapper[4826]: I0319 19:32:53.576148 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerStarted","Data":"14361bb02781b86ac9a4358bc6b4ca65f04d2c2834e57e3f100d9e70c932a555"} Mar 19 19:32:54 crc kubenswrapper[4826]: I0319 19:32:54.598879 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerStarted","Data":"c74bf03fa01849d19259075a758dcfa24ebbf18661aa55b2d83bada40297291a"} Mar 19 19:32:54 crc kubenswrapper[4826]: I0319 19:32:54.602017 4826 generic.go:334] "Generic (PLEG): container finished" podID="913489f6-b821-469f-bece-10ed5f48c768" containerID="330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6" exitCode=0 Mar 19 19:32:54 crc kubenswrapper[4826]: I0319 19:32:54.602074 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerDied","Data":"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6"} Mar 19 19:32:55 crc kubenswrapper[4826]: I0319 19:32:55.400142 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:32:55 crc kubenswrapper[4826]: I0319 19:32:55.400430 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:32:55 crc kubenswrapper[4826]: I0319 19:32:55.616719 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerStarted","Data":"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3"} Mar 19 19:32:55 crc kubenswrapper[4826]: I0319 19:32:55.649481 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-d5cbh" podStartSLOduration=2.049125925 podStartE2EDuration="6.649457016s" podCreationTimestamp="2026-03-19 19:32:49 +0000 UTC" firstStartedPulling="2026-03-19 19:32:50.543902411 +0000 UTC m=+2195.297970724" lastFinishedPulling="2026-03-19 19:32:55.144233462 +0000 UTC m=+2199.898301815" observedRunningTime="2026-03-19 19:32:55.637536095 +0000 UTC m=+2200.391604418" watchObservedRunningTime="2026-03-19 19:32:55.649457016 +0000 UTC m=+2200.403525339" Mar 19 19:32:57 crc kubenswrapper[4826]: I0319 19:32:57.642221 4826 generic.go:334] "Generic (PLEG): container finished" podID="e534979a-19b2-434a-9b5e-d7d7f4d9125b" containerID="306cb979a86190e83f4cf5415dad307250cc008f2573c486c534da3aa2ed80d3" exitCode=0 Mar 19 19:32:57 crc kubenswrapper[4826]: I0319 19:32:57.642310 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" event={"ID":"e534979a-19b2-434a-9b5e-d7d7f4d9125b","Type":"ContainerDied","Data":"306cb979a86190e83f4cf5415dad307250cc008f2573c486c534da3aa2ed80d3"} Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.173262 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309144 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309361 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309421 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309481 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309513 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 
crc kubenswrapper[4826]: I0319 19:32:59.309563 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309590 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309668 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59ll5\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309733 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309760 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " 
Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309807 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309833 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309863 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309884 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309926 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 
19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.309955 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle\") pod \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\" (UID: \"e534979a-19b2-434a-9b5e-d7d7f4d9125b\") " Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.317604 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.318986 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.319403 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.319466 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.319569 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.320544 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.320913 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.320936 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.321413 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.321853 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.322150 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.325254 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5" (OuterVolumeSpecName: "kube-api-access-59ll5") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "kube-api-access-59ll5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.326398 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.327747 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.350859 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.358380 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory" (OuterVolumeSpecName: "inventory") pod "e534979a-19b2-434a-9b5e-d7d7f4d9125b" (UID: "e534979a-19b2-434a-9b5e-d7d7f4d9125b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412789 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59ll5\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-kube-api-access-59ll5\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412830 4826 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412842 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412855 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412867 4826 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412877 4826 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412887 4826 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412897 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412907 4826 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412917 4826 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412926 4826 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412937 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412946 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc 
kubenswrapper[4826]: I0319 19:32:59.412955 4826 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e534979a-19b2-434a-9b5e-d7d7f4d9125b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412963 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.412971 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e534979a-19b2-434a-9b5e-d7d7f4d9125b-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.663218 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" event={"ID":"e534979a-19b2-434a-9b5e-d7d7f4d9125b","Type":"ContainerDied","Data":"34c75cac691547c7dbd0ef3a4cc5666dc42d0525863a7622c294c0461f2fb44a"} Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.663263 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c75cac691547c7dbd0ef3a4cc5666dc42d0525863a7622c294c0461f2fb44a" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.663318 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-glfmw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.773249 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.773298 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.845685 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw"] Mar 19 19:32:59 crc kubenswrapper[4826]: E0319 19:32:59.846648 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e534979a-19b2-434a-9b5e-d7d7f4d9125b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.846682 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e534979a-19b2-434a-9b5e-d7d7f4d9125b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.846978 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e534979a-19b2-434a-9b5e-d7d7f4d9125b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.848015 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.855420 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.855757 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.855757 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.855870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.855964 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.863236 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw"] Mar 19 19:32:59 crc kubenswrapper[4826]: E0319 19:32:59.884893 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode534979a_19b2_434a_9b5e_d7d7f4d9125b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode534979a_19b2_434a_9b5e_d7d7f4d9125b.slice/crio-34c75cac691547c7dbd0ef3a4cc5666dc42d0525863a7622c294c0461f2fb44a\": RecentStats: unable to find data in memory cache]" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.928038 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.928167 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.928250 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.928330 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:32:59 crc kubenswrapper[4826]: I0319 19:32:59.928351 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9s5\" (UniqueName: \"kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 
19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.030524 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.030699 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.030813 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.030941 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.030991 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9s5\" (UniqueName: \"kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5\") 
pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.032316 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.041270 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.041729 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.047168 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.050680 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7n9s5\" (UniqueName: \"kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ftdhw\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.189511 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.676337 4826 generic.go:334] "Generic (PLEG): container finished" podID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerID="c74bf03fa01849d19259075a758dcfa24ebbf18661aa55b2d83bada40297291a" exitCode=0 Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.676644 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerDied","Data":"c74bf03fa01849d19259075a758dcfa24ebbf18661aa55b2d83bada40297291a"} Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.840217 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw"] Mar 19 19:33:00 crc kubenswrapper[4826]: W0319 19:33:00.845421 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56108211_8bba_4740_8883_b40c8a139f8e.slice/crio-8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088 WatchSource:0}: Error finding container 8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088: Status 404 returned error can't find the container with id 8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088 Mar 19 19:33:00 crc kubenswrapper[4826]: I0319 19:33:00.847491 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d5cbh" 
podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="registry-server" probeResult="failure" output=< Mar 19 19:33:00 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:33:00 crc kubenswrapper[4826]: > Mar 19 19:33:01 crc kubenswrapper[4826]: I0319 19:33:01.691118 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" event={"ID":"56108211-8bba-4740-8883-b40c8a139f8e","Type":"ContainerStarted","Data":"3f6ea6c0f20fb2a27ff08576387f26ea90e2cffa929cc46ca844c83100666465"} Mar 19 19:33:01 crc kubenswrapper[4826]: I0319 19:33:01.691708 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" event={"ID":"56108211-8bba-4740-8883-b40c8a139f8e","Type":"ContainerStarted","Data":"8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088"} Mar 19 19:33:01 crc kubenswrapper[4826]: I0319 19:33:01.694109 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerStarted","Data":"86282ac2c301d61aede9fbfc5f6a73bcabbebc2a3b39c208f286d54ca9b9d011"} Mar 19 19:33:01 crc kubenswrapper[4826]: I0319 19:33:01.715316 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" podStartSLOduration=2.23797824 podStartE2EDuration="2.715300184s" podCreationTimestamp="2026-03-19 19:32:59 +0000 UTC" firstStartedPulling="2026-03-19 19:33:00.850601599 +0000 UTC m=+2205.604669922" lastFinishedPulling="2026-03-19 19:33:01.327923553 +0000 UTC m=+2206.081991866" observedRunningTime="2026-03-19 19:33:01.707450353 +0000 UTC m=+2206.461518666" watchObservedRunningTime="2026-03-19 19:33:01.715300184 +0000 UTC m=+2206.469368497" Mar 19 19:33:01 crc kubenswrapper[4826]: I0319 19:33:01.727176 4826 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-2r79f" podStartSLOduration=2.201924425 podStartE2EDuration="9.727143482s" podCreationTimestamp="2026-03-19 19:32:52 +0000 UTC" firstStartedPulling="2026-03-19 19:32:53.577942548 +0000 UTC m=+2198.332010861" lastFinishedPulling="2026-03-19 19:33:01.103161605 +0000 UTC m=+2205.857229918" observedRunningTime="2026-03-19 19:33:01.725845951 +0000 UTC m=+2206.479914264" watchObservedRunningTime="2026-03-19 19:33:01.727143482 +0000 UTC m=+2206.481211795" Mar 19 19:33:02 crc kubenswrapper[4826]: I0319 19:33:02.697861 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:02 crc kubenswrapper[4826]: I0319 19:33:02.698138 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:03 crc kubenswrapper[4826]: I0319 19:33:03.811635 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2r79f" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" probeResult="failure" output=< Mar 19 19:33:03 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:33:03 crc kubenswrapper[4826]: > Mar 19 19:33:09 crc kubenswrapper[4826]: I0319 19:33:09.823373 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:33:09 crc kubenswrapper[4826]: I0319 19:33:09.879096 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:33:10 crc kubenswrapper[4826]: I0319 19:33:10.568763 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:33:10 crc kubenswrapper[4826]: I0319 19:33:10.863367 4826 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-d5cbh" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="registry-server" containerID="cri-o://802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3" gracePeriod=2 Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.440634 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.543602 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities\") pod \"913489f6-b821-469f-bece-10ed5f48c768\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.543820 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content\") pod \"913489f6-b821-469f-bece-10ed5f48c768\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.543872 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbk4\" (UniqueName: \"kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4\") pod \"913489f6-b821-469f-bece-10ed5f48c768\" (UID: \"913489f6-b821-469f-bece-10ed5f48c768\") " Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.544799 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities" (OuterVolumeSpecName: "utilities") pod "913489f6-b821-469f-bece-10ed5f48c768" (UID: "913489f6-b821-469f-bece-10ed5f48c768"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.551799 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4" (OuterVolumeSpecName: "kube-api-access-fbbk4") pod "913489f6-b821-469f-bece-10ed5f48c768" (UID: "913489f6-b821-469f-bece-10ed5f48c768"). InnerVolumeSpecName "kube-api-access-fbbk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.602061 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913489f6-b821-469f-bece-10ed5f48c768" (UID: "913489f6-b821-469f-bece-10ed5f48c768"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.647179 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.647219 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbk4\" (UniqueName: \"kubernetes.io/projected/913489f6-b821-469f-bece-10ed5f48c768-kube-api-access-fbbk4\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.647235 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913489f6-b821-469f-bece-10ed5f48c768-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.875565 4826 generic.go:334] "Generic (PLEG): container finished" podID="913489f6-b821-469f-bece-10ed5f48c768" 
containerID="802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3" exitCode=0 Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.875772 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerDied","Data":"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3"} Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.876000 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5cbh" event={"ID":"913489f6-b821-469f-bece-10ed5f48c768","Type":"ContainerDied","Data":"6b7a48bb1b719b5abad299823995decb5597ccaf4d74fa71b6f27312c6fed1d0"} Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.875853 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5cbh" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.876031 4826 scope.go:117] "RemoveContainer" containerID="802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.912117 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.912204 4826 scope.go:117] "RemoveContainer" containerID="330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.923819 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5cbh"] Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.932353 4826 scope.go:117] "RemoveContainer" containerID="55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa" Mar 19 19:33:11 crc kubenswrapper[4826]: I0319 19:33:11.994665 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913489f6-b821-469f-bece-10ed5f48c768" 
path="/var/lib/kubelet/pods/913489f6-b821-469f-bece-10ed5f48c768/volumes" Mar 19 19:33:12 crc kubenswrapper[4826]: I0319 19:33:12.001108 4826 scope.go:117] "RemoveContainer" containerID="802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3" Mar 19 19:33:12 crc kubenswrapper[4826]: E0319 19:33:12.002983 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3\": container with ID starting with 802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3 not found: ID does not exist" containerID="802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3" Mar 19 19:33:12 crc kubenswrapper[4826]: I0319 19:33:12.003027 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3"} err="failed to get container status \"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3\": rpc error: code = NotFound desc = could not find container \"802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3\": container with ID starting with 802eb5ac661ecf5b0c7b7661dee50f704027624dedfe969318841f67ea69eab3 not found: ID does not exist" Mar 19 19:33:12 crc kubenswrapper[4826]: I0319 19:33:12.003050 4826 scope.go:117] "RemoveContainer" containerID="330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6" Mar 19 19:33:12 crc kubenswrapper[4826]: E0319 19:33:12.003315 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6\": container with ID starting with 330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6 not found: ID does not exist" containerID="330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6" Mar 19 19:33:12 crc kubenswrapper[4826]: 
I0319 19:33:12.003346 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6"} err="failed to get container status \"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6\": rpc error: code = NotFound desc = could not find container \"330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6\": container with ID starting with 330ec5127cd1daec3618327d3868fd446d30204f7d519fb964def21d82a256f6 not found: ID does not exist" Mar 19 19:33:12 crc kubenswrapper[4826]: I0319 19:33:12.003364 4826 scope.go:117] "RemoveContainer" containerID="55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa" Mar 19 19:33:12 crc kubenswrapper[4826]: E0319 19:33:12.003645 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa\": container with ID starting with 55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa not found: ID does not exist" containerID="55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa" Mar 19 19:33:12 crc kubenswrapper[4826]: I0319 19:33:12.003689 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa"} err="failed to get container status \"55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa\": rpc error: code = NotFound desc = could not find container \"55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa\": container with ID starting with 55c9c9f323f4e04b45e67749baa8493107d1d9d9e627e524b965bf8754d271aa not found: ID does not exist" Mar 19 19:33:13 crc kubenswrapper[4826]: I0319 19:33:13.755911 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2r79f" 
podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" probeResult="failure" output=< Mar 19 19:33:13 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:33:13 crc kubenswrapper[4826]: > Mar 19 19:33:23 crc kubenswrapper[4826]: I0319 19:33:23.770759 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2r79f" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" probeResult="failure" output=< Mar 19 19:33:23 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:33:23 crc kubenswrapper[4826]: > Mar 19 19:33:25 crc kubenswrapper[4826]: I0319 19:33:25.400992 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:33:25 crc kubenswrapper[4826]: I0319 19:33:25.401241 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:33:25 crc kubenswrapper[4826]: I0319 19:33:25.401295 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:33:25 crc kubenswrapper[4826]: I0319 19:33:25.402360 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:33:25 crc kubenswrapper[4826]: I0319 19:33:25.402435 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" gracePeriod=600 Mar 19 19:33:25 crc kubenswrapper[4826]: E0319 19:33:25.528408 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:33:26 crc kubenswrapper[4826]: I0319 19:33:26.042828 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" exitCode=0 Mar 19 19:33:26 crc kubenswrapper[4826]: I0319 19:33:26.042875 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"} Mar 19 19:33:26 crc kubenswrapper[4826]: I0319 19:33:26.042922 4826 scope.go:117] "RemoveContainer" containerID="daa7bd03e971974092a41659f4aba26392bc838aa5d2437fd4d817280d85c5e9" Mar 19 19:33:26 crc kubenswrapper[4826]: I0319 19:33:26.044855 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:33:26 crc kubenswrapper[4826]: E0319 19:33:26.046014 4826 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:33:29 crc kubenswrapper[4826]: I0319 19:33:29.039120 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-h4j6w"] Mar 19 19:33:29 crc kubenswrapper[4826]: I0319 19:33:29.050555 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-h4j6w"] Mar 19 19:33:29 crc kubenswrapper[4826]: I0319 19:33:29.989274 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a20d409-205e-4c2a-9197-a6fad3fc7e94" path="/var/lib/kubelet/pods/6a20d409-205e-4c2a-9197-a6fad3fc7e94/volumes" Mar 19 19:33:32 crc kubenswrapper[4826]: I0319 19:33:32.216853 4826 scope.go:117] "RemoveContainer" containerID="85d96ce366d983e2fc3b550894d052f631e25ebfeae724bed7516dfd925e0ee2" Mar 19 19:33:32 crc kubenswrapper[4826]: I0319 19:33:32.254466 4826 scope.go:117] "RemoveContainer" containerID="82a7514d32eb46e0c15228fa576238ef2b003bde6ee7e20d68573001764192ed" Mar 19 19:33:32 crc kubenswrapper[4826]: I0319 19:33:32.761477 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:32 crc kubenswrapper[4826]: I0319 19:33:32.842238 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.027780 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.028645 4826 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-2r79f" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" containerID="cri-o://86282ac2c301d61aede9fbfc5f6a73bcabbebc2a3b39c208f286d54ca9b9d011" gracePeriod=2 Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.172620 4826 generic.go:334] "Generic (PLEG): container finished" podID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerID="86282ac2c301d61aede9fbfc5f6a73bcabbebc2a3b39c208f286d54ca9b9d011" exitCode=0 Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.172700 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerDied","Data":"86282ac2c301d61aede9fbfc5f6a73bcabbebc2a3b39c208f286d54ca9b9d011"} Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.619001 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.681823 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities\") pod \"9bdf4030-5400-493e-bb9a-7988bd10dc68\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.682084 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldnqv\" (UniqueName: \"kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv\") pod \"9bdf4030-5400-493e-bb9a-7988bd10dc68\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.682265 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content\") pod 
\"9bdf4030-5400-493e-bb9a-7988bd10dc68\" (UID: \"9bdf4030-5400-493e-bb9a-7988bd10dc68\") " Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.682811 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities" (OuterVolumeSpecName: "utilities") pod "9bdf4030-5400-493e-bb9a-7988bd10dc68" (UID: "9bdf4030-5400-493e-bb9a-7988bd10dc68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.683053 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.690507 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv" (OuterVolumeSpecName: "kube-api-access-ldnqv") pod "9bdf4030-5400-493e-bb9a-7988bd10dc68" (UID: "9bdf4030-5400-493e-bb9a-7988bd10dc68"). InnerVolumeSpecName "kube-api-access-ldnqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.793230 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldnqv\" (UniqueName: \"kubernetes.io/projected/9bdf4030-5400-493e-bb9a-7988bd10dc68-kube-api-access-ldnqv\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.815985 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bdf4030-5400-493e-bb9a-7988bd10dc68" (UID: "9bdf4030-5400-493e-bb9a-7988bd10dc68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:33:36 crc kubenswrapper[4826]: I0319 19:33:36.895346 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bdf4030-5400-493e-bb9a-7988bd10dc68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.192339 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r79f" event={"ID":"9bdf4030-5400-493e-bb9a-7988bd10dc68","Type":"ContainerDied","Data":"14361bb02781b86ac9a4358bc6b4ca65f04d2c2834e57e3f100d9e70c932a555"} Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.192799 4826 scope.go:117] "RemoveContainer" containerID="86282ac2c301d61aede9fbfc5f6a73bcabbebc2a3b39c208f286d54ca9b9d011" Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.193039 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r79f" Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.246928 4826 scope.go:117] "RemoveContainer" containerID="c74bf03fa01849d19259075a758dcfa24ebbf18661aa55b2d83bada40297291a" Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.258328 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.277782 4826 scope.go:117] "RemoveContainer" containerID="73be179c0eab37f3888bb0ea6a88e78997d65af038d76af4fdcff0db9dd46449" Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.278387 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2r79f"] Mar 19 19:33:37 crc kubenswrapper[4826]: I0319 19:33:37.999394 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" path="/var/lib/kubelet/pods/9bdf4030-5400-493e-bb9a-7988bd10dc68/volumes" Mar 19 19:33:38 crc 
kubenswrapper[4826]: I0319 19:33:38.976754 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:33:38 crc kubenswrapper[4826]: E0319 19:33:38.977537 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:33:49 crc kubenswrapper[4826]: I0319 19:33:49.977332 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:33:49 crc kubenswrapper[4826]: E0319 19:33:49.978400 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.225989 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565814-drclz"] Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.226996 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227007 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.227017 4826 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227023 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="extract-content" Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.227059 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227065 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.227074 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227080 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="extract-utilities" Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.227088 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227094 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: E0319 19:34:00.227119 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227125 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227332 4826 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9bdf4030-5400-493e-bb9a-7988bd10dc68" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.227348 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="913489f6-b821-469f-bece-10ed5f48c768" containerName="registry-server" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.228126 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.235800 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42c2\" (UniqueName: \"kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2\") pod \"auto-csr-approver-29565814-drclz\" (UID: \"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb\") " pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.235923 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-drclz"] Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.236064 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.236144 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.237317 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.339860 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42c2\" (UniqueName: \"kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2\") pod \"auto-csr-approver-29565814-drclz\" (UID: \"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb\") " 
pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.362283 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42c2\" (UniqueName: \"kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2\") pod \"auto-csr-approver-29565814-drclz\" (UID: \"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb\") " pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:00 crc kubenswrapper[4826]: I0319 19:34:00.556262 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:01 crc kubenswrapper[4826]: W0319 19:34:01.122102 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01c4fcf_ef63_40d1_b71b_5212f5ad8ffb.slice/crio-0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e WatchSource:0}: Error finding container 0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e: Status 404 returned error can't find the container with id 0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e Mar 19 19:34:01 crc kubenswrapper[4826]: I0319 19:34:01.126090 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-drclz"] Mar 19 19:34:01 crc kubenswrapper[4826]: I0319 19:34:01.516300 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-drclz" event={"ID":"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb","Type":"ContainerStarted","Data":"0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e"} Mar 19 19:34:02 crc kubenswrapper[4826]: I0319 19:34:02.532802 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-drclz" 
event={"ID":"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb","Type":"ContainerStarted","Data":"14ab3d9e8c7e0d70aefc985d4182cba56433449962298bd91f293a74ae0a5a6a"} Mar 19 19:34:02 crc kubenswrapper[4826]: I0319 19:34:02.554751 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565814-drclz" podStartSLOduration=1.646700483 podStartE2EDuration="2.554723223s" podCreationTimestamp="2026-03-19 19:34:00 +0000 UTC" firstStartedPulling="2026-03-19 19:34:01.127026787 +0000 UTC m=+2265.881095120" lastFinishedPulling="2026-03-19 19:34:02.035049557 +0000 UTC m=+2266.789117860" observedRunningTime="2026-03-19 19:34:02.547380534 +0000 UTC m=+2267.301448867" watchObservedRunningTime="2026-03-19 19:34:02.554723223 +0000 UTC m=+2267.308791566" Mar 19 19:34:02 crc kubenswrapper[4826]: I0319 19:34:02.976154 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:34:02 crc kubenswrapper[4826]: E0319 19:34:02.976797 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:34:03 crc kubenswrapper[4826]: I0319 19:34:03.547100 4826 generic.go:334] "Generic (PLEG): container finished" podID="a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" containerID="14ab3d9e8c7e0d70aefc985d4182cba56433449962298bd91f293a74ae0a5a6a" exitCode=0 Mar 19 19:34:03 crc kubenswrapper[4826]: I0319 19:34:03.547152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-drclz" 
event={"ID":"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb","Type":"ContainerDied","Data":"14ab3d9e8c7e0d70aefc985d4182cba56433449962298bd91f293a74ae0a5a6a"} Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.021194 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.165034 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42c2\" (UniqueName: \"kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2\") pod \"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb\" (UID: \"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb\") " Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.171364 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2" (OuterVolumeSpecName: "kube-api-access-k42c2") pod "a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" (UID: "a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb"). InnerVolumeSpecName "kube-api-access-k42c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.269758 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42c2\" (UniqueName: \"kubernetes.io/projected/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb-kube-api-access-k42c2\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.580877 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565814-drclz" event={"ID":"a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb","Type":"ContainerDied","Data":"0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e"} Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.580922 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de29f0924eaf70e43bb83b74dd024f302dbc61202599828daea73153e5fa11e" Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.580978 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565814-drclz" Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.635476 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-dff7k"] Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.644710 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565808-dff7k"] Mar 19 19:34:05 crc kubenswrapper[4826]: I0319 19:34:05.998344 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a" path="/var/lib/kubelet/pods/3ab61a4b-32ad-4a2b-b2a4-d12a17a6f63a/volumes" Mar 19 19:34:06 crc kubenswrapper[4826]: I0319 19:34:06.597736 4826 generic.go:334] "Generic (PLEG): container finished" podID="56108211-8bba-4740-8883-b40c8a139f8e" containerID="3f6ea6c0f20fb2a27ff08576387f26ea90e2cffa929cc46ca844c83100666465" exitCode=0 Mar 19 19:34:06 crc kubenswrapper[4826]: I0319 19:34:06.597813 4826 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" event={"ID":"56108211-8bba-4740-8883-b40c8a139f8e","Type":"ContainerDied","Data":"3f6ea6c0f20fb2a27ff08576387f26ea90e2cffa929cc46ca844c83100666465"} Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.119094 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.249424 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam\") pod \"56108211-8bba-4740-8883-b40c8a139f8e\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.249644 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle\") pod \"56108211-8bba-4740-8883-b40c8a139f8e\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.249758 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n9s5\" (UniqueName: \"kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5\") pod \"56108211-8bba-4740-8883-b40c8a139f8e\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.249926 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0\") pod \"56108211-8bba-4740-8883-b40c8a139f8e\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 
19:34:08.249982 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory\") pod \"56108211-8bba-4740-8883-b40c8a139f8e\" (UID: \"56108211-8bba-4740-8883-b40c8a139f8e\") " Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.261340 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5" (OuterVolumeSpecName: "kube-api-access-7n9s5") pod "56108211-8bba-4740-8883-b40c8a139f8e" (UID: "56108211-8bba-4740-8883-b40c8a139f8e"). InnerVolumeSpecName "kube-api-access-7n9s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.264061 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "56108211-8bba-4740-8883-b40c8a139f8e" (UID: "56108211-8bba-4740-8883-b40c8a139f8e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.283696 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "56108211-8bba-4740-8883-b40c8a139f8e" (UID: "56108211-8bba-4740-8883-b40c8a139f8e"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.287192 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory" (OuterVolumeSpecName: "inventory") pod "56108211-8bba-4740-8883-b40c8a139f8e" (UID: "56108211-8bba-4740-8883-b40c8a139f8e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.291749 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "56108211-8bba-4740-8883-b40c8a139f8e" (UID: "56108211-8bba-4740-8883-b40c8a139f8e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.353184 4826 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.353220 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n9s5\" (UniqueName: \"kubernetes.io/projected/56108211-8bba-4740-8883-b40c8a139f8e-kube-api-access-7n9s5\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.353233 4826 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/56108211-8bba-4740-8883-b40c8a139f8e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.353244 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.353256 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56108211-8bba-4740-8883-b40c8a139f8e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.627333 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" event={"ID":"56108211-8bba-4740-8883-b40c8a139f8e","Type":"ContainerDied","Data":"8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088"} Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.627436 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ftdhw" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.627407 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8609e2e355b2b2418516c3b3712da2e11ce39f2f21bbbf6e0ae42178cb50e088" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.758448 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg"] Mar 19 19:34:08 crc kubenswrapper[4826]: E0319 19:34:08.759114 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56108211-8bba-4740-8883-b40c8a139f8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.759137 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="56108211-8bba-4740-8883-b40c8a139f8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:34:08 crc kubenswrapper[4826]: E0319 19:34:08.759181 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" containerName="oc" Mar 19 19:34:08 crc 
kubenswrapper[4826]: I0319 19:34:08.759188 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" containerName="oc" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.759400 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" containerName="oc" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.759425 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="56108211-8bba-4740-8883-b40c8a139f8e" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.760271 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.764772 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.765349 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.766134 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.766370 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.766554 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.768479 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.780922 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg"] Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.786155 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.788833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.788876 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.788921 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.788944 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9nn\" (UniqueName: \"kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.788995 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891238 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891289 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 
19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891332 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891355 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9nn\" (UniqueName: \"kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891405 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.891610 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.896509 4826 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.897902 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.899195 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.899786 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.907522 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:08 crc kubenswrapper[4826]: I0319 19:34:08.908530 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9nn\" (UniqueName: \"kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:09 crc kubenswrapper[4826]: I0319 19:34:09.090352 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:34:09 crc kubenswrapper[4826]: I0319 19:34:09.719496 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg"] Mar 19 19:34:10 crc kubenswrapper[4826]: I0319 19:34:10.646983 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" event={"ID":"29402a90-3f51-4212-bc44-5c382894d0e6","Type":"ContainerStarted","Data":"1a8b0c1ce284202d397e0d565439fb292572989d931452b804f42af740a0d416"} Mar 19 19:34:10 crc kubenswrapper[4826]: I0319 19:34:10.649426 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" event={"ID":"29402a90-3f51-4212-bc44-5c382894d0e6","Type":"ContainerStarted","Data":"9efafb4c1a80528d1b5d8f9ee6304ed278c7930cdee1d4872e11ee326c1111ef"} Mar 19 19:34:10 crc kubenswrapper[4826]: I0319 19:34:10.689606 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" podStartSLOduration=2.178254965 podStartE2EDuration="2.689585528s" podCreationTimestamp="2026-03-19 19:34:08 +0000 UTC" firstStartedPulling="2026-03-19 19:34:09.730598585 +0000 UTC m=+2274.484666938" lastFinishedPulling="2026-03-19 19:34:10.241929178 +0000 UTC m=+2274.995997501" observedRunningTime="2026-03-19 19:34:10.674603414 +0000 UTC m=+2275.428671767" watchObservedRunningTime="2026-03-19 19:34:10.689585528 +0000 UTC m=+2275.443653851" Mar 19 19:34:16 crc kubenswrapper[4826]: I0319 19:34:16.978443 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:34:16 crc kubenswrapper[4826]: E0319 19:34:16.980716 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:34:28 crc kubenswrapper[4826]: I0319 19:34:28.976218 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:34:28 crc kubenswrapper[4826]: E0319 19:34:28.977240 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:34:32 crc kubenswrapper[4826]: I0319 19:34:32.439372 4826 scope.go:117] "RemoveContainer" 
containerID="2e1f4d70f3f933df3f2a33288b8b007376aad8dbc3bf9c627028b05b6d1f6ab9" Mar 19 19:34:41 crc kubenswrapper[4826]: I0319 19:34:41.977369 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:34:41 crc kubenswrapper[4826]: E0319 19:34:41.978522 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:34:56 crc kubenswrapper[4826]: I0319 19:34:56.975915 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:34:56 crc kubenswrapper[4826]: E0319 19:34:56.977034 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:35:03 crc kubenswrapper[4826]: I0319 19:35:03.322183 4826 generic.go:334] "Generic (PLEG): container finished" podID="29402a90-3f51-4212-bc44-5c382894d0e6" containerID="1a8b0c1ce284202d397e0d565439fb292572989d931452b804f42af740a0d416" exitCode=0 Mar 19 19:35:03 crc kubenswrapper[4826]: I0319 19:35:03.322267 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" 
event={"ID":"29402a90-3f51-4212-bc44-5c382894d0e6","Type":"ContainerDied","Data":"1a8b0c1ce284202d397e0d565439fb292572989d931452b804f42af740a0d416"} Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.136594 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.254988 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9nn\" (UniqueName: \"kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.255061 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.255157 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.255205 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.255249 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.255299 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0\") pod \"29402a90-3f51-4212-bc44-5c382894d0e6\" (UID: \"29402a90-3f51-4212-bc44-5c382894d0e6\") " Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.263486 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn" (OuterVolumeSpecName: "kube-api-access-cs9nn") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "kube-api-access-cs9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.268870 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.291271 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory" (OuterVolumeSpecName: "inventory") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.294882 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.304407 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.323833 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "29402a90-3f51-4212-bc44-5c382894d0e6" (UID: "29402a90-3f51-4212-bc44-5c382894d0e6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.351047 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" event={"ID":"29402a90-3f51-4212-bc44-5c382894d0e6","Type":"ContainerDied","Data":"9efafb4c1a80528d1b5d8f9ee6304ed278c7930cdee1d4872e11ee326c1111ef"} Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.351093 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efafb4c1a80528d1b5d8f9ee6304ed278c7930cdee1d4872e11ee326c1111ef" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.351089 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358521 4826 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358568 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9nn\" (UniqueName: \"kubernetes.io/projected/29402a90-3f51-4212-bc44-5c382894d0e6-kube-api-access-cs9nn\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358590 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358610 4826 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 
19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358629 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.358648 4826 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29402a90-3f51-4212-bc44-5c382894d0e6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.459317 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj"] Mar 19 19:35:05 crc kubenswrapper[4826]: E0319 19:35:05.459815 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29402a90-3f51-4212-bc44-5c382894d0e6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.459830 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="29402a90-3f51-4212-bc44-5c382894d0e6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.460091 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="29402a90-3f51-4212-bc44-5c382894d0e6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.460917 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.465518 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.465903 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.466261 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.466836 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.467100 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.481819 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj"] Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.564248 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.564339 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzch2\" (UniqueName: \"kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: 
\"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.564458 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.564513 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.564579 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.667514 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 
19:35:05.667676 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.667858 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.667899 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzch2\" (UniqueName: \"kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.667973 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.671981 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.672643 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.674169 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.674522 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.689570 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzch2\" (UniqueName: \"kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:05 crc kubenswrapper[4826]: I0319 19:35:05.791900 4826 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" Mar 19 19:35:06 crc kubenswrapper[4826]: I0319 19:35:06.452424 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:35:06 crc kubenswrapper[4826]: I0319 19:35:06.456646 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj"] Mar 19 19:35:07 crc kubenswrapper[4826]: I0319 19:35:07.375902 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" event={"ID":"0e998b4b-67f9-4f5c-bcb7-df21ad523a61","Type":"ContainerStarted","Data":"36911c531c6d905b0325832eb4a67ef05142a09510d6e6b79f71bcbe023deb94"} Mar 19 19:35:07 crc kubenswrapper[4826]: I0319 19:35:07.376632 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" event={"ID":"0e998b4b-67f9-4f5c-bcb7-df21ad523a61","Type":"ContainerStarted","Data":"d680badfc5c39264722171b1b44267538d112feb9a01e8e6adabe71ac9fc9ff9"} Mar 19 19:35:07 crc kubenswrapper[4826]: I0319 19:35:07.421093 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" podStartSLOduration=2.016081253 podStartE2EDuration="2.421067191s" podCreationTimestamp="2026-03-19 19:35:05 +0000 UTC" firstStartedPulling="2026-03-19 19:35:06.452163872 +0000 UTC m=+2331.206232175" lastFinishedPulling="2026-03-19 19:35:06.8571498 +0000 UTC m=+2331.611218113" observedRunningTime="2026-03-19 19:35:07.396688959 +0000 UTC m=+2332.150757282" watchObservedRunningTime="2026-03-19 19:35:07.421067191 +0000 UTC m=+2332.175135514" Mar 19 19:35:10 crc kubenswrapper[4826]: I0319 19:35:10.978121 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:35:10 crc kubenswrapper[4826]: E0319 
19:35:10.979165 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:35:25 crc kubenswrapper[4826]: I0319 19:35:25.988078 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:35:25 crc kubenswrapper[4826]: E0319 19:35:25.989165 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:35:36 crc kubenswrapper[4826]: I0319 19:35:36.976694 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:35:36 crc kubenswrapper[4826]: E0319 19:35:36.977914 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:35:49 crc kubenswrapper[4826]: I0319 19:35:49.976570 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:35:49 crc 
kubenswrapper[4826]: E0319 19:35:49.977300 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.160740 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565816-2mk4w"] Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.163274 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-2mk4w" Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.166279 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.166733 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.167307 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.172777 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-2mk4w"] Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.196879 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp4v9\" (UniqueName: \"kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9\") pod \"auto-csr-approver-29565816-2mk4w\" (UID: \"e7533636-0dfb-4a71-9d89-ffa996e4175b\") " pod="openshift-infra/auto-csr-approver-29565816-2mk4w" Mar 19 19:36:00 crc 
kubenswrapper[4826]: I0319 19:36:00.299703 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp4v9\" (UniqueName: \"kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9\") pod \"auto-csr-approver-29565816-2mk4w\" (UID: \"e7533636-0dfb-4a71-9d89-ffa996e4175b\") " pod="openshift-infra/auto-csr-approver-29565816-2mk4w"
Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.321847 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp4v9\" (UniqueName: \"kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9\") pod \"auto-csr-approver-29565816-2mk4w\" (UID: \"e7533636-0dfb-4a71-9d89-ffa996e4175b\") " pod="openshift-infra/auto-csr-approver-29565816-2mk4w"
Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.490048 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-2mk4w"
Mar 19 19:36:00 crc kubenswrapper[4826]: I0319 19:36:00.960019 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-2mk4w"]
Mar 19 19:36:01 crc kubenswrapper[4826]: I0319 19:36:01.104525 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-2mk4w" event={"ID":"e7533636-0dfb-4a71-9d89-ffa996e4175b","Type":"ContainerStarted","Data":"e10e9d2eeac8cb9594ec263ade0b83bfe93870237103d6fce61f9ae104a4daa6"}
Mar 19 19:36:02 crc kubenswrapper[4826]: I0319 19:36:02.977796 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:36:02 crc kubenswrapper[4826]: E0319 19:36:02.978481 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:36:03 crc kubenswrapper[4826]: I0319 19:36:03.129020 4826 generic.go:334] "Generic (PLEG): container finished" podID="e7533636-0dfb-4a71-9d89-ffa996e4175b" containerID="b59f89413e5881f3ccfb0f8fa9915e909bbee426cbef4b65785c848a6455896c" exitCode=0
Mar 19 19:36:03 crc kubenswrapper[4826]: I0319 19:36:03.129089 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-2mk4w" event={"ID":"e7533636-0dfb-4a71-9d89-ffa996e4175b","Type":"ContainerDied","Data":"b59f89413e5881f3ccfb0f8fa9915e909bbee426cbef4b65785c848a6455896c"}
Mar 19 19:36:04 crc kubenswrapper[4826]: I0319 19:36:04.554178 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-2mk4w"
Mar 19 19:36:04 crc kubenswrapper[4826]: I0319 19:36:04.721122 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp4v9\" (UniqueName: \"kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9\") pod \"e7533636-0dfb-4a71-9d89-ffa996e4175b\" (UID: \"e7533636-0dfb-4a71-9d89-ffa996e4175b\") "
Mar 19 19:36:04 crc kubenswrapper[4826]: I0319 19:36:04.741197 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9" (OuterVolumeSpecName: "kube-api-access-pp4v9") pod "e7533636-0dfb-4a71-9d89-ffa996e4175b" (UID: "e7533636-0dfb-4a71-9d89-ffa996e4175b"). InnerVolumeSpecName "kube-api-access-pp4v9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:36:04 crc kubenswrapper[4826]: I0319 19:36:04.826105 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp4v9\" (UniqueName: \"kubernetes.io/projected/e7533636-0dfb-4a71-9d89-ffa996e4175b-kube-api-access-pp4v9\") on node \"crc\" DevicePath \"\""
Mar 19 19:36:05 crc kubenswrapper[4826]: I0319 19:36:05.151433 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565816-2mk4w" event={"ID":"e7533636-0dfb-4a71-9d89-ffa996e4175b","Type":"ContainerDied","Data":"e10e9d2eeac8cb9594ec263ade0b83bfe93870237103d6fce61f9ae104a4daa6"}
Mar 19 19:36:05 crc kubenswrapper[4826]: I0319 19:36:05.151493 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e10e9d2eeac8cb9594ec263ade0b83bfe93870237103d6fce61f9ae104a4daa6"
Mar 19 19:36:05 crc kubenswrapper[4826]: I0319 19:36:05.151543 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565816-2mk4w"
Mar 19 19:36:05 crc kubenswrapper[4826]: I0319 19:36:05.652057 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-9wxvc"]
Mar 19 19:36:05 crc kubenswrapper[4826]: I0319 19:36:05.663426 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565810-9wxvc"]
Mar 19 19:36:06 crc kubenswrapper[4826]: I0319 19:36:06.002374 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda6fead-94fb-42e0-84f6-f318afdb6969" path="/var/lib/kubelet/pods/bda6fead-94fb-42e0-84f6-f318afdb6969/volumes"
Mar 19 19:36:17 crc kubenswrapper[4826]: I0319 19:36:17.977343 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:36:17 crc kubenswrapper[4826]: E0319 19:36:17.978777 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:36:30 crc kubenswrapper[4826]: I0319 19:36:30.976517 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:36:30 crc kubenswrapper[4826]: E0319 19:36:30.977188 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:36:32 crc kubenswrapper[4826]: I0319 19:36:32.595751 4826 scope.go:117] "RemoveContainer" containerID="c96fa13f4c93ffd55256410848bc8474ed30078636283bba2babc2b8c71622ff"
Mar 19 19:36:42 crc kubenswrapper[4826]: I0319 19:36:42.976433 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:36:42 crc kubenswrapper[4826]: E0319 19:36:42.977378 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:36:53 crc kubenswrapper[4826]: I0319 19:36:53.978361 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:36:53 crc kubenswrapper[4826]: E0319 19:36:53.979177 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:37:07 crc kubenswrapper[4826]: I0319 19:37:07.979160 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:37:07 crc kubenswrapper[4826]: E0319 19:37:07.981527 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:37:19 crc kubenswrapper[4826]: I0319 19:37:19.984619 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:37:19 crc kubenswrapper[4826]: E0319 19:37:19.985740 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:37:32 crc kubenswrapper[4826]: I0319 19:37:32.977197 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:37:32 crc kubenswrapper[4826]: E0319 19:37:32.980766 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:37:45 crc kubenswrapper[4826]: I0319 19:37:45.977090 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:37:45 crc kubenswrapper[4826]: E0319 19:37:45.978405 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:37:58 crc kubenswrapper[4826]: I0319 19:37:58.977207 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:37:58 crc kubenswrapper[4826]: E0319 19:37:58.978464 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.174542 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565818-wnkgs"]
Mar 19 19:38:00 crc kubenswrapper[4826]: E0319 19:38:00.175409 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7533636-0dfb-4a71-9d89-ffa996e4175b" containerName="oc"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.175425 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7533636-0dfb-4a71-9d89-ffa996e4175b" containerName="oc"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.175776 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7533636-0dfb-4a71-9d89-ffa996e4175b" containerName="oc"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.176792 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.179770 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.179913 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.180006 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.188002 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-wnkgs"]
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.292869 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76dq\" (UniqueName: \"kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq\") pod \"auto-csr-approver-29565818-wnkgs\" (UID: \"61abc61b-9740-4acd-9ea8-5d49946d31cd\") " pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.395008 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q76dq\" (UniqueName: \"kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq\") pod \"auto-csr-approver-29565818-wnkgs\" (UID: \"61abc61b-9740-4acd-9ea8-5d49946d31cd\") " pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.413351 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76dq\" (UniqueName: \"kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq\") pod \"auto-csr-approver-29565818-wnkgs\" (UID: \"61abc61b-9740-4acd-9ea8-5d49946d31cd\") " pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:00 crc kubenswrapper[4826]: I0319 19:38:00.499378 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:01 crc kubenswrapper[4826]: I0319 19:38:01.001138 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-wnkgs"]
Mar 19 19:38:01 crc kubenswrapper[4826]: W0319 19:38:01.006564 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61abc61b_9740_4acd_9ea8_5d49946d31cd.slice/crio-e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59 WatchSource:0}: Error finding container e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59: Status 404 returned error can't find the container with id e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59
Mar 19 19:38:01 crc kubenswrapper[4826]: I0319 19:38:01.739026 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-wnkgs" event={"ID":"61abc61b-9740-4acd-9ea8-5d49946d31cd","Type":"ContainerStarted","Data":"e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59"}
Mar 19 19:38:02 crc kubenswrapper[4826]: I0319 19:38:02.750948 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-wnkgs" event={"ID":"61abc61b-9740-4acd-9ea8-5d49946d31cd","Type":"ContainerStarted","Data":"4dc34e9c32d100aaa792c14dabd72182c65b8ac3e20f67568af720d1c7dcfe22"}
Mar 19 19:38:02 crc kubenswrapper[4826]: I0319 19:38:02.775483 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565818-wnkgs" podStartSLOduration=1.57423654 podStartE2EDuration="2.775463951s" podCreationTimestamp="2026-03-19 19:38:00 +0000 UTC" firstStartedPulling="2026-03-19 19:38:01.011546559 +0000 UTC m=+2505.765614912" lastFinishedPulling="2026-03-19 19:38:02.212774 +0000 UTC m=+2506.966842323" observedRunningTime="2026-03-19 19:38:02.767098818 +0000 UTC m=+2507.521167141" watchObservedRunningTime="2026-03-19 19:38:02.775463951 +0000 UTC m=+2507.529532264"
Mar 19 19:38:03 crc kubenswrapper[4826]: I0319 19:38:03.764900 4826 generic.go:334] "Generic (PLEG): container finished" podID="61abc61b-9740-4acd-9ea8-5d49946d31cd" containerID="4dc34e9c32d100aaa792c14dabd72182c65b8ac3e20f67568af720d1c7dcfe22" exitCode=0
Mar 19 19:38:03 crc kubenswrapper[4826]: I0319 19:38:03.765029 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-wnkgs" event={"ID":"61abc61b-9740-4acd-9ea8-5d49946d31cd","Type":"ContainerDied","Data":"4dc34e9c32d100aaa792c14dabd72182c65b8ac3e20f67568af720d1c7dcfe22"}
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.309354 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.441772 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q76dq\" (UniqueName: \"kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq\") pod \"61abc61b-9740-4acd-9ea8-5d49946d31cd\" (UID: \"61abc61b-9740-4acd-9ea8-5d49946d31cd\") "
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.452384 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq" (OuterVolumeSpecName: "kube-api-access-q76dq") pod "61abc61b-9740-4acd-9ea8-5d49946d31cd" (UID: "61abc61b-9740-4acd-9ea8-5d49946d31cd"). InnerVolumeSpecName "kube-api-access-q76dq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.546446 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q76dq\" (UniqueName: \"kubernetes.io/projected/61abc61b-9740-4acd-9ea8-5d49946d31cd-kube-api-access-q76dq\") on node \"crc\" DevicePath \"\""
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.815068 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565818-wnkgs" event={"ID":"61abc61b-9740-4acd-9ea8-5d49946d31cd","Type":"ContainerDied","Data":"e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59"}
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.815111 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e88c76a47a032dbde36ac5d64bc769b740223866df4374e3fe356a3cd6202b59"
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.815121 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565818-wnkgs"
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.861647 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-l2b98"]
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.872349 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565812-l2b98"]
Mar 19 19:38:05 crc kubenswrapper[4826]: I0319 19:38:05.991916 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69938b68-3fa5-4ed5-acd8-ca538ae21efb" path="/var/lib/kubelet/pods/69938b68-3fa5-4ed5-acd8-ca538ae21efb/volumes"
Mar 19 19:38:09 crc kubenswrapper[4826]: I0319 19:38:09.978106 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:38:09 crc kubenswrapper[4826]: E0319 19:38:09.979169 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:38:23 crc kubenswrapper[4826]: I0319 19:38:23.977430 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:38:23 crc kubenswrapper[4826]: E0319 19:38:23.978339 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 19:38:32 crc kubenswrapper[4826]: I0319 19:38:32.718333 4826 scope.go:117] "RemoveContainer" containerID="b76b0b59c8ee17e1a0e07823ccb1a646ba9e016245e00ac1feb580f549936e37"
Mar 19 19:38:37 crc kubenswrapper[4826]: I0319 19:38:37.977488 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246"
Mar 19 19:38:38 crc kubenswrapper[4826]: I0319 19:38:38.291453 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db"}
Mar 19 19:39:08 crc kubenswrapper[4826]: I0319 19:39:08.699305 4826 generic.go:334] "Generic (PLEG): container finished" podID="0e998b4b-67f9-4f5c-bcb7-df21ad523a61" containerID="36911c531c6d905b0325832eb4a67ef05142a09510d6e6b79f71bcbe023deb94" exitCode=0
Mar 19 19:39:08 crc kubenswrapper[4826]: I0319 19:39:08.699452 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" event={"ID":"0e998b4b-67f9-4f5c-bcb7-df21ad523a61","Type":"ContainerDied","Data":"36911c531c6d905b0325832eb4a67ef05142a09510d6e6b79f71bcbe023deb94"}
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.321784 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.438226 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory\") pod \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") "
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.438328 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzch2\" (UniqueName: \"kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2\") pod \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") "
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.438392 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam\") pod \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") "
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.438455 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle\") pod \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") "
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.438632 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0\") pod \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\" (UID: \"0e998b4b-67f9-4f5c-bcb7-df21ad523a61\") "
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.456692 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0e998b4b-67f9-4f5c-bcb7-df21ad523a61" (UID: "0e998b4b-67f9-4f5c-bcb7-df21ad523a61"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.457790 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2" (OuterVolumeSpecName: "kube-api-access-dzch2") pod "0e998b4b-67f9-4f5c-bcb7-df21ad523a61" (UID: "0e998b4b-67f9-4f5c-bcb7-df21ad523a61"). InnerVolumeSpecName "kube-api-access-dzch2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.473521 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e998b4b-67f9-4f5c-bcb7-df21ad523a61" (UID: "0e998b4b-67f9-4f5c-bcb7-df21ad523a61"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.475908 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory" (OuterVolumeSpecName: "inventory") pod "0e998b4b-67f9-4f5c-bcb7-df21ad523a61" (UID: "0e998b4b-67f9-4f5c-bcb7-df21ad523a61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.487569 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0e998b4b-67f9-4f5c-bcb7-df21ad523a61" (UID: "0e998b4b-67f9-4f5c-bcb7-df21ad523a61"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.548229 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.548277 4826 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.548298 4826 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.548322 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.548341 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzch2\" (UniqueName: \"kubernetes.io/projected/0e998b4b-67f9-4f5c-bcb7-df21ad523a61-kube-api-access-dzch2\") on node \"crc\" DevicePath \"\""
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.732314 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj" event={"ID":"0e998b4b-67f9-4f5c-bcb7-df21ad523a61","Type":"ContainerDied","Data":"d680badfc5c39264722171b1b44267538d112feb9a01e8e6adabe71ac9fc9ff9"}
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.732363 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d680badfc5c39264722171b1b44267538d112feb9a01e8e6adabe71ac9fc9ff9"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.732426 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.830963 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"]
Mar 19 19:39:10 crc kubenswrapper[4826]: E0319 19:39:10.832249 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61abc61b-9740-4acd-9ea8-5d49946d31cd" containerName="oc"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.832272 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="61abc61b-9740-4acd-9ea8-5d49946d31cd" containerName="oc"
Mar 19 19:39:10 crc kubenswrapper[4826]: E0319 19:39:10.832303 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e998b4b-67f9-4f5c-bcb7-df21ad523a61" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.832311 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e998b4b-67f9-4f5c-bcb7-df21ad523a61" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.832513 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="61abc61b-9740-4acd-9ea8-5d49946d31cd" containerName="oc"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.832538 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e998b4b-67f9-4f5c-bcb7-df21ad523a61" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.833349 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.838202 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.838608 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.838828 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.838985 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.843386 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.843569 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.843611 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.844873 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"]
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.982747 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.982858 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.982943 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983007 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983082 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983123 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983180 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983284 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983348 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983410 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v479\" (UniqueName: \"kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:10 crc kubenswrapper[4826]: I0319 19:39:10.983455 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.085674 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086055 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086095 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086135 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086154 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086195 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086281 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086319 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086353 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v479\" (UniqueName: \"kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086388 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.086434 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.087909 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.090841 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.091251 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.091377 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.092084 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 
19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.093155 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.093293 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.093360 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.094286 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.094625 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.110310 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v479\" (UniqueName: \"kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ppww7\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.175557 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" Mar 19 19:39:11 crc kubenswrapper[4826]: I0319 19:39:11.823688 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"] Mar 19 19:39:11 crc kubenswrapper[4826]: W0319 19:39:11.825477 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd55a2c66_72a6_4026_90be_b4a15bf7914a.slice/crio-8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88 WatchSource:0}: Error finding container 8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88: Status 404 returned error can't find the container with id 8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88 Mar 19 19:39:12 crc kubenswrapper[4826]: I0319 19:39:12.756566 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" event={"ID":"d55a2c66-72a6-4026-90be-b4a15bf7914a","Type":"ContainerStarted","Data":"eb0923ada20969f4f1ccded053400054374f8eae7eb72cf8723619c7316d695e"} Mar 19 19:39:12 crc kubenswrapper[4826]: I0319 
19:39:12.757353 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" event={"ID":"d55a2c66-72a6-4026-90be-b4a15bf7914a","Type":"ContainerStarted","Data":"8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88"} Mar 19 19:39:12 crc kubenswrapper[4826]: I0319 19:39:12.786234 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" podStartSLOduration=2.298271903 podStartE2EDuration="2.786217252s" podCreationTimestamp="2026-03-19 19:39:10 +0000 UTC" firstStartedPulling="2026-03-19 19:39:11.829375766 +0000 UTC m=+2576.583444079" lastFinishedPulling="2026-03-19 19:39:12.317321115 +0000 UTC m=+2577.071389428" observedRunningTime="2026-03-19 19:39:12.77708604 +0000 UTC m=+2577.531154383" watchObservedRunningTime="2026-03-19 19:39:12.786217252 +0000 UTC m=+2577.540285575" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.159826 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565820-4q9l7"] Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.162090 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.164899 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.167035 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.167036 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.176929 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-4q9l7"] Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.191057 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7sjz\" (UniqueName: \"kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz\") pod \"auto-csr-approver-29565820-4q9l7\" (UID: \"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60\") " pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.293346 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7sjz\" (UniqueName: \"kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz\") pod \"auto-csr-approver-29565820-4q9l7\" (UID: \"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60\") " pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.312277 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7sjz\" (UniqueName: \"kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz\") pod \"auto-csr-approver-29565820-4q9l7\" (UID: \"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60\") " 
pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:00 crc kubenswrapper[4826]: I0319 19:40:00.493241 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:01 crc kubenswrapper[4826]: I0319 19:40:01.014949 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-4q9l7"] Mar 19 19:40:01 crc kubenswrapper[4826]: W0319 19:40:01.021910 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b0f07e_6a17_4cdf_ba58_0e8835e71c60.slice/crio-706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866 WatchSource:0}: Error finding container 706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866: Status 404 returned error can't find the container with id 706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866 Mar 19 19:40:01 crc kubenswrapper[4826]: I0319 19:40:01.366578 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" event={"ID":"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60","Type":"ContainerStarted","Data":"706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866"} Mar 19 19:40:03 crc kubenswrapper[4826]: I0319 19:40:03.391447 4826 generic.go:334] "Generic (PLEG): container finished" podID="e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" containerID="7faa0834afc197d5950427f0c944db59b695dacf648b7f57b5ada0678cff3e3a" exitCode=0 Mar 19 19:40:03 crc kubenswrapper[4826]: I0319 19:40:03.391555 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" event={"ID":"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60","Type":"ContainerDied","Data":"7faa0834afc197d5950427f0c944db59b695dacf648b7f57b5ada0678cff3e3a"} Mar 19 19:40:04 crc kubenswrapper[4826]: I0319 19:40:04.844595 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.016175 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7sjz\" (UniqueName: \"kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz\") pod \"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60\" (UID: \"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60\") " Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.022605 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz" (OuterVolumeSpecName: "kube-api-access-l7sjz") pod "e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" (UID: "e7b0f07e-6a17-4cdf-ba58-0e8835e71c60"). InnerVolumeSpecName "kube-api-access-l7sjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.119636 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7sjz\" (UniqueName: \"kubernetes.io/projected/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60-kube-api-access-l7sjz\") on node \"crc\" DevicePath \"\"" Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.416715 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" event={"ID":"e7b0f07e-6a17-4cdf-ba58-0e8835e71c60","Type":"ContainerDied","Data":"706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866"} Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.416754 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="706687b698a620ad21142755b087ee943ac7e1c95fec291e9e410628dc5f1866" Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.416766 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565820-4q9l7" Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.938443 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-drclz"] Mar 19 19:40:05 crc kubenswrapper[4826]: I0319 19:40:05.962336 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565814-drclz"] Mar 19 19:40:06 crc kubenswrapper[4826]: I0319 19:40:06.001117 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb" path="/var/lib/kubelet/pods/a01c4fcf-ef63-40d1-b71b-5212f5ad8ffb/volumes" Mar 19 19:40:32 crc kubenswrapper[4826]: I0319 19:40:32.869880 4826 scope.go:117] "RemoveContainer" containerID="14ab3d9e8c7e0d70aefc985d4182cba56433449962298bd91f293a74ae0a5a6a" Mar 19 19:40:47 crc kubenswrapper[4826]: I0319 19:40:47.947241 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7fjb"] Mar 19 19:40:47 crc kubenswrapper[4826]: E0319 19:40:47.948386 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" containerName="oc" Mar 19 19:40:47 crc kubenswrapper[4826]: I0319 19:40:47.948402 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" containerName="oc" Mar 19 19:40:47 crc kubenswrapper[4826]: I0319 19:40:47.948698 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" containerName="oc" Mar 19 19:40:47 crc kubenswrapper[4826]: I0319 19:40:47.953024 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.003321 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7fjb"] Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.058524 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.058569 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.058753 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlh8q\" (UniqueName: \"kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.161111 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.161435 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.161552 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlh8q\" (UniqueName: \"kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.161788 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.161846 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.184552 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlh8q\" (UniqueName: \"kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q\") pod \"community-operators-m7fjb\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") " pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.289419 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.858091 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7fjb"] Mar 19 19:40:48 crc kubenswrapper[4826]: I0319 19:40:48.994757 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerStarted","Data":"2efc7fb21abb0ef021dce2b22222cce3e23205b67b1483afb068b123bc5e8eec"} Mar 19 19:40:50 crc kubenswrapper[4826]: I0319 19:40:50.024156 4826 generic.go:334] "Generic (PLEG): container finished" podID="9b11123c-f8ea-4722-a551-c34be7914b10" containerID="fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b" exitCode=0 Mar 19 19:40:50 crc kubenswrapper[4826]: I0319 19:40:50.024239 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerDied","Data":"fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b"} Mar 19 19:40:50 crc kubenswrapper[4826]: I0319 19:40:50.036382 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:40:51 crc kubenswrapper[4826]: I0319 19:40:51.046407 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerStarted","Data":"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"} Mar 19 19:40:53 crc kubenswrapper[4826]: I0319 19:40:53.074023 4826 generic.go:334] "Generic (PLEG): container finished" podID="9b11123c-f8ea-4722-a551-c34be7914b10" containerID="3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5" exitCode=0 Mar 19 19:40:53 crc kubenswrapper[4826]: I0319 19:40:53.074179 4826 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerDied","Data":"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"} Mar 19 19:40:54 crc kubenswrapper[4826]: I0319 19:40:54.086603 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerStarted","Data":"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"} Mar 19 19:40:54 crc kubenswrapper[4826]: I0319 19:40:54.118170 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7fjb" podStartSLOduration=3.638473067 podStartE2EDuration="7.118151811s" podCreationTimestamp="2026-03-19 19:40:47 +0000 UTC" firstStartedPulling="2026-03-19 19:40:50.036088566 +0000 UTC m=+2674.790156879" lastFinishedPulling="2026-03-19 19:40:53.51576726 +0000 UTC m=+2678.269835623" observedRunningTime="2026-03-19 19:40:54.111313035 +0000 UTC m=+2678.865381378" watchObservedRunningTime="2026-03-19 19:40:54.118151811 +0000 UTC m=+2678.872220124" Mar 19 19:40:55 crc kubenswrapper[4826]: I0319 19:40:55.400309 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:40:55 crc kubenswrapper[4826]: I0319 19:40:55.400879 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:40:58 crc kubenswrapper[4826]: I0319 19:40:58.290534 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:58 crc kubenswrapper[4826]: I0319 19:40:58.291403 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:58 crc kubenswrapper[4826]: I0319 19:40:58.357047 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:59 crc kubenswrapper[4826]: I0319 19:40:59.210231 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7fjb" Mar 19 19:40:59 crc kubenswrapper[4826]: I0319 19:40:59.265402 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7fjb"] Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.157042 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m7fjb" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="registry-server" containerID="cri-o://43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4" gracePeriod=2 Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.786984 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7fjb"
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.947434 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlh8q\" (UniqueName: \"kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q\") pod \"9b11123c-f8ea-4722-a551-c34be7914b10\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") "
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.948034 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities\") pod \"9b11123c-f8ea-4722-a551-c34be7914b10\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") "
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.948150 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content\") pod \"9b11123c-f8ea-4722-a551-c34be7914b10\" (UID: \"9b11123c-f8ea-4722-a551-c34be7914b10\") "
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.948720 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities" (OuterVolumeSpecName: "utilities") pod "9b11123c-f8ea-4722-a551-c34be7914b10" (UID: "9b11123c-f8ea-4722-a551-c34be7914b10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.949535 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:01 crc kubenswrapper[4826]: I0319 19:41:01.953951 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q" (OuterVolumeSpecName: "kube-api-access-tlh8q") pod "9b11123c-f8ea-4722-a551-c34be7914b10" (UID: "9b11123c-f8ea-4722-a551-c34be7914b10"). InnerVolumeSpecName "kube-api-access-tlh8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.012543 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b11123c-f8ea-4722-a551-c34be7914b10" (UID: "9b11123c-f8ea-4722-a551-c34be7914b10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.053369 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b11123c-f8ea-4722-a551-c34be7914b10-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.053403 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlh8q\" (UniqueName: \"kubernetes.io/projected/9b11123c-f8ea-4722-a551-c34be7914b10-kube-api-access-tlh8q\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.201541 4826 generic.go:334] "Generic (PLEG): container finished" podID="9b11123c-f8ea-4722-a551-c34be7914b10" containerID="43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4" exitCode=0
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.201601 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerDied","Data":"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"}
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.201667 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7fjb" event={"ID":"9b11123c-f8ea-4722-a551-c34be7914b10","Type":"ContainerDied","Data":"2efc7fb21abb0ef021dce2b22222cce3e23205b67b1483afb068b123bc5e8eec"}
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.201691 4826 scope.go:117] "RemoveContainer" containerID="43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.201713 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7fjb"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.227750 4826 scope.go:117] "RemoveContainer" containerID="3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.268863 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7fjb"]
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.273536 4826 scope.go:117] "RemoveContainer" containerID="fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.276071 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m7fjb"]
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.351771 4826 scope.go:117] "RemoveContainer" containerID="43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"
Mar 19 19:41:02 crc kubenswrapper[4826]: E0319 19:41:02.352250 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4\": container with ID starting with 43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4 not found: ID does not exist" containerID="43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.352299 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4"} err="failed to get container status \"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4\": rpc error: code = NotFound desc = could not find container \"43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4\": container with ID starting with 43ab453193d48f42b15ac301cded5ab25a3d4808fd200918825624172dce1ca4 not found: ID does not exist"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.352331 4826 scope.go:117] "RemoveContainer" containerID="3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"
Mar 19 19:41:02 crc kubenswrapper[4826]: E0319 19:41:02.352615 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5\": container with ID starting with 3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5 not found: ID does not exist" containerID="3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.352643 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5"} err="failed to get container status \"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5\": rpc error: code = NotFound desc = could not find container \"3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5\": container with ID starting with 3825e1aa2542f4202bea71f3bca642d61f8666b5f09ef82fc4a56983690ac5e5 not found: ID does not exist"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.352676 4826 scope.go:117] "RemoveContainer" containerID="fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b"
Mar 19 19:41:02 crc kubenswrapper[4826]: E0319 19:41:02.353058 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b\": container with ID starting with fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b not found: ID does not exist" containerID="fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b"
Mar 19 19:41:02 crc kubenswrapper[4826]: I0319 19:41:02.353083 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b"} err="failed to get container status \"fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b\": rpc error: code = NotFound desc = could not find container \"fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b\": container with ID starting with fa1fdce1fd9c828c45f557c24637e8472d00bb07495216ba30cfae0b2dbe099b not found: ID does not exist"
Mar 19 19:41:03 crc kubenswrapper[4826]: I0319 19:41:03.997162 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" path="/var/lib/kubelet/pods/9b11123c-f8ea-4722-a551-c34be7914b10/volumes"
Mar 19 19:41:25 crc kubenswrapper[4826]: I0319 19:41:25.400333 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 19:41:25 crc kubenswrapper[4826]: I0319 19:41:25.400797 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 19:41:49 crc kubenswrapper[4826]: I0319 19:41:49.792952 4826 generic.go:334] "Generic (PLEG): container finished" podID="d55a2c66-72a6-4026-90be-b4a15bf7914a" containerID="eb0923ada20969f4f1ccded053400054374f8eae7eb72cf8723619c7316d695e" exitCode=0
Mar 19 19:41:49 crc kubenswrapper[4826]: I0319 19:41:49.793061 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" event={"ID":"d55a2c66-72a6-4026-90be-b4a15bf7914a","Type":"ContainerDied","Data":"eb0923ada20969f4f1ccded053400054374f8eae7eb72cf8723619c7316d695e"}
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.411857 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.466674 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.466734 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.466823 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.466895 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.466942 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467004 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467029 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467162 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467187 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v479\" (UniqueName: \"kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467321 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.467365 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0\") pod \"d55a2c66-72a6-4026-90be-b4a15bf7914a\" (UID: \"d55a2c66-72a6-4026-90be-b4a15bf7914a\") "
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.471450 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.490611 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479" (OuterVolumeSpecName: "kube-api-access-6v479") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "kube-api-access-6v479". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.501783 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.519027 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.519367 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory" (OuterVolumeSpecName: "inventory") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.520417 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.525258 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.533025 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.534069 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.540498 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.548341 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d55a2c66-72a6-4026-90be-b4a15bf7914a" (UID: "d55a2c66-72a6-4026-90be-b4a15bf7914a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571003 4826 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571235 4826 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571366 4826 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571454 4826 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571546 4826 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571628 4826 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571733 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571804 4826 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571872 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.571969 4826 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d55a2c66-72a6-4026-90be-b4a15bf7914a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.572047 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v479\" (UniqueName: \"kubernetes.io/projected/d55a2c66-72a6-4026-90be-b4a15bf7914a-kube-api-access-6v479\") on node \"crc\" DevicePath \"\""
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.818382 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7" event={"ID":"d55a2c66-72a6-4026-90be-b4a15bf7914a","Type":"ContainerDied","Data":"8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88"}
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.818765 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9d87da26f62f3a4a92f72e75182e35cdf6b57ae8785ee4874fdcea96eb9e88"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.818895 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ppww7"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.956376 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"]
Mar 19 19:41:51 crc kubenswrapper[4826]: E0319 19:41:51.956977 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55a2c66-72a6-4026-90be-b4a15bf7914a" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957003 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55a2c66-72a6-4026-90be-b4a15bf7914a" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:41:51 crc kubenswrapper[4826]: E0319 19:41:51.957036 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="extract-utilities"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957044 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="extract-utilities"
Mar 19 19:41:51 crc kubenswrapper[4826]: E0319 19:41:51.957058 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="registry-server"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957067 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="registry-server"
Mar 19 19:41:51 crc kubenswrapper[4826]: E0319 19:41:51.957084 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="extract-content"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957092 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="extract-content"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957330 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55a2c66-72a6-4026-90be-b4a15bf7914a" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.957356 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b11123c-f8ea-4722-a551-c34be7914b10" containerName="registry-server"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.958369 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.963535 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.964480 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.964549 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.964613 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.964905 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw"
Mar 19 19:41:51 crc kubenswrapper[4826]: I0319 19:41:51.973357 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"]
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084285 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084388 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9bf\" (UniqueName: \"kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084422 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084529 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084588 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.084967 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.085149 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.188020 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg9bf\" (UniqueName: \"kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.188449 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.188606 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.188735 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.188900 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.189003 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.189091 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.193634 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.194303 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.194914 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.195140 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.195414 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.195461 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.221158 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9bf\" (UniqueName: \"kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-78qhl\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.298909 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"
Mar 19 19:41:52 crc kubenswrapper[4826]: I0319 19:41:52.911376 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl"]
Mar 19 19:41:53 crc kubenswrapper[4826]: I0319 19:41:53.843984 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" event={"ID":"b0112fd9-267c-4357-8120-f42c43662900","Type":"ContainerStarted","Data":"25615835ae77cc7386e0da64324b69147ed0cb70121720fa2b7e264997c4be29"}
Mar 19 19:41:53 crc kubenswrapper[4826]: I0319 19:41:53.844586 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" event={"ID":"b0112fd9-267c-4357-8120-f42c43662900","Type":"ContainerStarted","Data":"6b2961f97ceabd95504ebbaa3065a5f2e7743cd4368fc900782f1e3f59e2f11b"}
Mar 19 19:41:53 crc kubenswrapper[4826]: I0319 19:41:53.874617 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" podStartSLOduration=2.438023861 podStartE2EDuration="2.874591526s" podCreationTimestamp="2026-03-19 19:41:51 +0000 UTC" firstStartedPulling="2026-03-19 19:41:52.904609169 +0000 UTC m=+2737.658677482" lastFinishedPulling="2026-03-19 19:41:53.341176794 +0000 UTC m=+2738.095245147" observedRunningTime="2026-03-19 19:41:53.864475949 +0000 UTC m=+2738.618544282" watchObservedRunningTime="2026-03-19 19:41:53.874591526 +0000 UTC m=+2738.628659849"
Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.400404 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 19:41:55 crc kubenswrapper[4826]: 
I0319 19:41:55.400906 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.400984 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.402594 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.402836 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db" gracePeriod=600 Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.872092 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db" exitCode=0 Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.872152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db"} Mar 19 19:41:55 crc 
kubenswrapper[4826]: I0319 19:41:55.872892 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3"} Mar 19 19:41:55 crc kubenswrapper[4826]: I0319 19:41:55.872933 4826 scope.go:117] "RemoveContainer" containerID="7ad46c116609c2194bb036463992f5f5e8e6454d574c11fba78d76956fe99246" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.167205 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565822-bt6tr"] Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.169860 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.172713 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.172971 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.176074 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.190913 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-bt6tr"] Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.215137 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9bh\" (UniqueName: \"kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh\") pod \"auto-csr-approver-29565822-bt6tr\" (UID: \"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1\") " 
pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.317940 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt9bh\" (UniqueName: \"kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh\") pod \"auto-csr-approver-29565822-bt6tr\" (UID: \"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1\") " pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.344100 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt9bh\" (UniqueName: \"kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh\") pod \"auto-csr-approver-29565822-bt6tr\" (UID: \"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1\") " pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:00 crc kubenswrapper[4826]: I0319 19:42:00.500419 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:01 crc kubenswrapper[4826]: I0319 19:42:01.034635 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-bt6tr"] Mar 19 19:42:01 crc kubenswrapper[4826]: I0319 19:42:01.945648 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" event={"ID":"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1","Type":"ContainerStarted","Data":"fdc16f7007f3d0b23c02faf39799563685f7d863aac1b06e06af5d6574b00f44"} Mar 19 19:42:02 crc kubenswrapper[4826]: I0319 19:42:02.956946 4826 generic.go:334] "Generic (PLEG): container finished" podID="64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" containerID="532f76c99aef136f1b552bd11a5f2e49684dd5eab2678609af11e40537af7c72" exitCode=0 Mar 19 19:42:02 crc kubenswrapper[4826]: I0319 19:42:02.957079 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565822-bt6tr" event={"ID":"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1","Type":"ContainerDied","Data":"532f76c99aef136f1b552bd11a5f2e49684dd5eab2678609af11e40537af7c72"} Mar 19 19:42:04 crc kubenswrapper[4826]: I0319 19:42:04.376028 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:04 crc kubenswrapper[4826]: I0319 19:42:04.589101 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt9bh\" (UniqueName: \"kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh\") pod \"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1\" (UID: \"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1\") " Mar 19 19:42:04 crc kubenswrapper[4826]: I0319 19:42:04.598383 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh" (OuterVolumeSpecName: "kube-api-access-mt9bh") pod "64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" (UID: "64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1"). InnerVolumeSpecName "kube-api-access-mt9bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:42:04 crc kubenswrapper[4826]: I0319 19:42:04.692870 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt9bh\" (UniqueName: \"kubernetes.io/projected/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1-kube-api-access-mt9bh\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:05 crc kubenswrapper[4826]: I0319 19:42:05.014210 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" event={"ID":"64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1","Type":"ContainerDied","Data":"fdc16f7007f3d0b23c02faf39799563685f7d863aac1b06e06af5d6574b00f44"} Mar 19 19:42:05 crc kubenswrapper[4826]: I0319 19:42:05.014444 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdc16f7007f3d0b23c02faf39799563685f7d863aac1b06e06af5d6574b00f44" Mar 19 19:42:05 crc kubenswrapper[4826]: I0319 19:42:05.014395 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565822-bt6tr" Mar 19 19:42:05 crc kubenswrapper[4826]: I0319 19:42:05.484146 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-2mk4w"] Mar 19 19:42:05 crc kubenswrapper[4826]: I0319 19:42:05.507976 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565816-2mk4w"] Mar 19 19:42:06 crc kubenswrapper[4826]: I0319 19:42:06.000458 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7533636-0dfb-4a71-9d89-ffa996e4175b" path="/var/lib/kubelet/pods/e7533636-0dfb-4a71-9d89-ffa996e4175b/volumes" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.415371 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:30 crc kubenswrapper[4826]: E0319 19:42:30.416434 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" containerName="oc" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.416450 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" containerName="oc" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.416840 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" containerName="oc" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.418952 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.433781 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.519121 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.519227 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.519340 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr2h7\" (UniqueName: \"kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " 
pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.621428 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.621569 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr2h7\" (UniqueName: \"kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.621710 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.622190 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.622416 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" 
Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.644014 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr2h7\" (UniqueName: \"kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7\") pod \"redhat-marketplace-558vw\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:30 crc kubenswrapper[4826]: I0319 19:42:30.747872 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:31 crc kubenswrapper[4826]: W0319 19:42:31.261046 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b7e00ec_25a7_42a2_9407_d8aa83b3e2ee.slice/crio-82c95f7ea7f7a947e62e9346bd6df3a960b9a4ab256c1c8beca727a63273926e WatchSource:0}: Error finding container 82c95f7ea7f7a947e62e9346bd6df3a960b9a4ab256c1c8beca727a63273926e: Status 404 returned error can't find the container with id 82c95f7ea7f7a947e62e9346bd6df3a960b9a4ab256c1c8beca727a63273926e Mar 19 19:42:31 crc kubenswrapper[4826]: I0319 19:42:31.261705 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:31 crc kubenswrapper[4826]: I0319 19:42:31.368731 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerStarted","Data":"82c95f7ea7f7a947e62e9346bd6df3a960b9a4ab256c1c8beca727a63273926e"} Mar 19 19:42:32 crc kubenswrapper[4826]: I0319 19:42:32.386918 4826 generic.go:334] "Generic (PLEG): container finished" podID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerID="4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9" exitCode=0 Mar 19 19:42:32 crc kubenswrapper[4826]: I0319 19:42:32.387051 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerDied","Data":"4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9"} Mar 19 19:42:33 crc kubenswrapper[4826]: I0319 19:42:33.025249 4826 scope.go:117] "RemoveContainer" containerID="b59f89413e5881f3ccfb0f8fa9915e909bbee426cbef4b65785c848a6455896c" Mar 19 19:42:34 crc kubenswrapper[4826]: I0319 19:42:34.415614 4826 generic.go:334] "Generic (PLEG): container finished" podID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerID="7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3" exitCode=0 Mar 19 19:42:34 crc kubenswrapper[4826]: I0319 19:42:34.416279 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerDied","Data":"7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3"} Mar 19 19:42:35 crc kubenswrapper[4826]: I0319 19:42:35.435854 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerStarted","Data":"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754"} Mar 19 19:42:35 crc kubenswrapper[4826]: I0319 19:42:35.471811 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-558vw" podStartSLOduration=3.030148604 podStartE2EDuration="5.471782907s" podCreationTimestamp="2026-03-19 19:42:30 +0000 UTC" firstStartedPulling="2026-03-19 19:42:32.390666611 +0000 UTC m=+2777.144734924" lastFinishedPulling="2026-03-19 19:42:34.832300864 +0000 UTC m=+2779.586369227" observedRunningTime="2026-03-19 19:42:35.463800162 +0000 UTC m=+2780.217868555" watchObservedRunningTime="2026-03-19 19:42:35.471782907 +0000 UTC m=+2780.225851260" Mar 19 19:42:40 crc kubenswrapper[4826]: I0319 19:42:40.748554 4826 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:40 crc kubenswrapper[4826]: I0319 19:42:40.750227 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:40 crc kubenswrapper[4826]: I0319 19:42:40.829366 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:41 crc kubenswrapper[4826]: I0319 19:42:41.601344 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:41 crc kubenswrapper[4826]: I0319 19:42:41.676444 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:43 crc kubenswrapper[4826]: I0319 19:42:43.529481 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-558vw" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="registry-server" containerID="cri-o://68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754" gracePeriod=2 Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.038516 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.200822 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr2h7\" (UniqueName: \"kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7\") pod \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.201326 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities\") pod \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.202610 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities" (OuterVolumeSpecName: "utilities") pod "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" (UID: "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.202867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content\") pod \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\" (UID: \"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee\") " Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.204173 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.206576 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7" (OuterVolumeSpecName: "kube-api-access-cr2h7") pod "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" (UID: "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee"). InnerVolumeSpecName "kube-api-access-cr2h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.249456 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" (UID: "4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.306205 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr2h7\" (UniqueName: \"kubernetes.io/projected/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-kube-api-access-cr2h7\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.306235 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.544116 4826 generic.go:334] "Generic (PLEG): container finished" podID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerID="68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754" exitCode=0 Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.544176 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerDied","Data":"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754"} Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.544223 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-558vw" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.544238 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-558vw" event={"ID":"4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee","Type":"ContainerDied","Data":"82c95f7ea7f7a947e62e9346bd6df3a960b9a4ab256c1c8beca727a63273926e"} Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.544271 4826 scope.go:117] "RemoveContainer" containerID="68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.573497 4826 scope.go:117] "RemoveContainer" containerID="7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.634218 4826 scope.go:117] "RemoveContainer" containerID="4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.651286 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.667782 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-558vw"] Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.688176 4826 scope.go:117] "RemoveContainer" containerID="68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754" Mar 19 19:42:44 crc kubenswrapper[4826]: E0319 19:42:44.688675 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754\": container with ID starting with 68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754 not found: ID does not exist" containerID="68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.688709 4826 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754"} err="failed to get container status \"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754\": rpc error: code = NotFound desc = could not find container \"68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754\": container with ID starting with 68b32c372c256a252e648015c453d2b911fa50333d3e7975b9f73252b8233754 not found: ID does not exist" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.688730 4826 scope.go:117] "RemoveContainer" containerID="7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3" Mar 19 19:42:44 crc kubenswrapper[4826]: E0319 19:42:44.689267 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3\": container with ID starting with 7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3 not found: ID does not exist" containerID="7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.689314 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3"} err="failed to get container status \"7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3\": rpc error: code = NotFound desc = could not find container \"7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3\": container with ID starting with 7cccdda2e162703ff34d748c509e64b33f5075f4753909f089352080d0b227a3 not found: ID does not exist" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.689346 4826 scope.go:117] "RemoveContainer" containerID="4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9" Mar 19 19:42:44 crc kubenswrapper[4826]: E0319 
19:42:44.689671 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9\": container with ID starting with 4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9 not found: ID does not exist" containerID="4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9" Mar 19 19:42:44 crc kubenswrapper[4826]: I0319 19:42:44.689708 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9"} err="failed to get container status \"4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9\": rpc error: code = NotFound desc = could not find container \"4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9\": container with ID starting with 4bb842f105ee30c998efbef465f0c12974492a619e30955d1db350a4d49361b9 not found: ID does not exist" Mar 19 19:42:45 crc kubenswrapper[4826]: I0319 19:42:45.998422 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" path="/var/lib/kubelet/pods/4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee/volumes" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.682802 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:36 crc kubenswrapper[4826]: E0319 19:43:36.684036 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="registry-server" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.684056 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="registry-server" Mar 19 19:43:36 crc kubenswrapper[4826]: E0319 19:43:36.684115 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" 
containerName="extract-content" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.684126 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="extract-content" Mar 19 19:43:36 crc kubenswrapper[4826]: E0319 19:43:36.684161 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="extract-utilities" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.684170 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="extract-utilities" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.684643 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7e00ec-25a7-42a2-9407-d8aa83b3e2ee" containerName="registry-server" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.686966 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.700640 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.758238 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.758561 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl527\" (UniqueName: \"kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " 
pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.758642 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.860838 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.860903 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl527\" (UniqueName: \"kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.860977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.861446 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 
19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.861664 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:36 crc kubenswrapper[4826]: I0319 19:43:36.883210 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl527\" (UniqueName: \"kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527\") pod \"redhat-operators-dvxjf\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:37 crc kubenswrapper[4826]: I0319 19:43:37.024153 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:37 crc kubenswrapper[4826]: I0319 19:43:37.503365 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:38 crc kubenswrapper[4826]: I0319 19:43:38.252120 4826 generic.go:334] "Generic (PLEG): container finished" podID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerID="a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719" exitCode=0 Mar 19 19:43:38 crc kubenswrapper[4826]: I0319 19:43:38.252206 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerDied","Data":"a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719"} Mar 19 19:43:38 crc kubenswrapper[4826]: I0319 19:43:38.252396 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" 
event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerStarted","Data":"b6df0d97e40d57d0bf90ba6d99f330a5b6890d3ce20a248810f60abd3722de50"} Mar 19 19:43:40 crc kubenswrapper[4826]: I0319 19:43:40.287961 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerStarted","Data":"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca"} Mar 19 19:43:45 crc kubenswrapper[4826]: I0319 19:43:45.353446 4826 generic.go:334] "Generic (PLEG): container finished" podID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerID="772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca" exitCode=0 Mar 19 19:43:45 crc kubenswrapper[4826]: I0319 19:43:45.353541 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerDied","Data":"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca"} Mar 19 19:43:46 crc kubenswrapper[4826]: I0319 19:43:46.372625 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerStarted","Data":"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417"} Mar 19 19:43:46 crc kubenswrapper[4826]: I0319 19:43:46.396840 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dvxjf" podStartSLOduration=2.872705425 podStartE2EDuration="10.39682165s" podCreationTimestamp="2026-03-19 19:43:36 +0000 UTC" firstStartedPulling="2026-03-19 19:43:38.254201773 +0000 UTC m=+2843.008270086" lastFinishedPulling="2026-03-19 19:43:45.778317998 +0000 UTC m=+2850.532386311" observedRunningTime="2026-03-19 19:43:46.387081033 +0000 UTC m=+2851.141149356" watchObservedRunningTime="2026-03-19 19:43:46.39682165 +0000 UTC m=+2851.150889973" 
Mar 19 19:43:47 crc kubenswrapper[4826]: I0319 19:43:47.025033 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:47 crc kubenswrapper[4826]: I0319 19:43:47.026817 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:48 crc kubenswrapper[4826]: I0319 19:43:48.094343 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dvxjf" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="registry-server" probeResult="failure" output=< Mar 19 19:43:48 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:43:48 crc kubenswrapper[4826]: > Mar 19 19:43:55 crc kubenswrapper[4826]: I0319 19:43:55.401084 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:43:55 crc kubenswrapper[4826]: I0319 19:43:55.401602 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:43:57 crc kubenswrapper[4826]: I0319 19:43:57.077482 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:57 crc kubenswrapper[4826]: I0319 19:43:57.215431 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:57 crc kubenswrapper[4826]: I0319 19:43:57.356589 4826 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:58 crc kubenswrapper[4826]: I0319 19:43:58.495385 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dvxjf" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="registry-server" containerID="cri-o://d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417" gracePeriod=2 Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.015998 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.183477 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities\") pod \"ea0b635b-b959-4ba8-a150-963b477bfab5\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.183569 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl527\" (UniqueName: \"kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527\") pod \"ea0b635b-b959-4ba8-a150-963b477bfab5\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.183985 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content\") pod \"ea0b635b-b959-4ba8-a150-963b477bfab5\" (UID: \"ea0b635b-b959-4ba8-a150-963b477bfab5\") " Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.184712 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities" (OuterVolumeSpecName: "utilities") pod 
"ea0b635b-b959-4ba8-a150-963b477bfab5" (UID: "ea0b635b-b959-4ba8-a150-963b477bfab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.193858 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527" (OuterVolumeSpecName: "kube-api-access-cl527") pod "ea0b635b-b959-4ba8-a150-963b477bfab5" (UID: "ea0b635b-b959-4ba8-a150-963b477bfab5"). InnerVolumeSpecName "kube-api-access-cl527". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.287796 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.288093 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl527\" (UniqueName: \"kubernetes.io/projected/ea0b635b-b959-4ba8-a150-963b477bfab5-kube-api-access-cl527\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.333031 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea0b635b-b959-4ba8-a150-963b477bfab5" (UID: "ea0b635b-b959-4ba8-a150-963b477bfab5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.391538 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0b635b-b959-4ba8-a150-963b477bfab5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.526430 4826 generic.go:334] "Generic (PLEG): container finished" podID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerID="d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417" exitCode=0 Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.526585 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvxjf" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.526582 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerDied","Data":"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417"} Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.531447 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvxjf" event={"ID":"ea0b635b-b959-4ba8-a150-963b477bfab5","Type":"ContainerDied","Data":"b6df0d97e40d57d0bf90ba6d99f330a5b6890d3ce20a248810f60abd3722de50"} Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.531602 4826 scope.go:117] "RemoveContainer" containerID="d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.570105 4826 scope.go:117] "RemoveContainer" containerID="772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.606925 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 
19:43:59.622035 4826 scope.go:117] "RemoveContainer" containerID="a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.622453 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dvxjf"] Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.673688 4826 scope.go:117] "RemoveContainer" containerID="d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417" Mar 19 19:43:59 crc kubenswrapper[4826]: E0319 19:43:59.674556 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417\": container with ID starting with d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417 not found: ID does not exist" containerID="d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.674608 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417"} err="failed to get container status \"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417\": rpc error: code = NotFound desc = could not find container \"d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417\": container with ID starting with d3bf965b8759a729d24cac64d2d359652a4afdd7e5b51f0b2e75ed6642368417 not found: ID does not exist" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.674641 4826 scope.go:117] "RemoveContainer" containerID="772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca" Mar 19 19:43:59 crc kubenswrapper[4826]: E0319 19:43:59.674990 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca\": container with ID 
starting with 772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca not found: ID does not exist" containerID="772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.675016 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca"} err="failed to get container status \"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca\": rpc error: code = NotFound desc = could not find container \"772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca\": container with ID starting with 772e282efee1825963797e837c1356fd9e342391ae204cac3f45a2c06443cbca not found: ID does not exist" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.675040 4826 scope.go:117] "RemoveContainer" containerID="a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719" Mar 19 19:43:59 crc kubenswrapper[4826]: E0319 19:43:59.675580 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719\": container with ID starting with a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719 not found: ID does not exist" containerID="a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.675708 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719"} err="failed to get container status \"a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719\": rpc error: code = NotFound desc = could not find container \"a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719\": container with ID starting with a71fee4141272de01a8eee2c35a30dd3f0a4c67d3f5c2801290ec888921ad719 not found: 
ID does not exist" Mar 19 19:43:59 crc kubenswrapper[4826]: I0319 19:43:59.998215 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" path="/var/lib/kubelet/pods/ea0b635b-b959-4ba8-a150-963b477bfab5/volumes" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.167498 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565824-5zkh8"] Mar 19 19:44:00 crc kubenswrapper[4826]: E0319 19:44:00.168400 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.168442 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[4826]: E0319 19:44:00.168526 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.168551 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="extract-content" Mar 19 19:44:00 crc kubenswrapper[4826]: E0319 19:44:00.168634 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="extract-utilities" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.168713 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="extract-utilities" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.169165 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0b635b-b959-4ba8-a150-963b477bfab5" containerName="registry-server" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.170768 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.173810 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.174027 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.174236 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.191907 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-5zkh8"] Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.216012 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr52l\" (UniqueName: \"kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l\") pod \"auto-csr-approver-29565824-5zkh8\" (UID: \"3edb6db8-83eb-40a2-b680-cdbf4524b608\") " pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.319930 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr52l\" (UniqueName: \"kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l\") pod \"auto-csr-approver-29565824-5zkh8\" (UID: \"3edb6db8-83eb-40a2-b680-cdbf4524b608\") " pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.342419 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr52l\" (UniqueName: \"kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l\") pod \"auto-csr-approver-29565824-5zkh8\" (UID: \"3edb6db8-83eb-40a2-b680-cdbf4524b608\") " 
pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:00 crc kubenswrapper[4826]: I0319 19:44:00.494811 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:01 crc kubenswrapper[4826]: I0319 19:44:01.050238 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-5zkh8"] Mar 19 19:44:01 crc kubenswrapper[4826]: W0319 19:44:01.056523 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3edb6db8_83eb_40a2_b680_cdbf4524b608.slice/crio-5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843 WatchSource:0}: Error finding container 5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843: Status 404 returned error can't find the container with id 5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843 Mar 19 19:44:01 crc kubenswrapper[4826]: I0319 19:44:01.562610 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" event={"ID":"3edb6db8-83eb-40a2-b680-cdbf4524b608","Type":"ContainerStarted","Data":"5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843"} Mar 19 19:44:02 crc kubenswrapper[4826]: I0319 19:44:02.573334 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" event={"ID":"3edb6db8-83eb-40a2-b680-cdbf4524b608","Type":"ContainerStarted","Data":"65c80aeee7cb6c0e621b68a4b4202a2ff50636e37fdeb37cbf5da06e3aea8210"} Mar 19 19:44:02 crc kubenswrapper[4826]: I0319 19:44:02.597188 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" podStartSLOduration=1.7436448150000001 podStartE2EDuration="2.597172468s" podCreationTimestamp="2026-03-19 19:44:00 +0000 UTC" firstStartedPulling="2026-03-19 19:44:01.059788373 +0000 UTC 
m=+2865.813856706" lastFinishedPulling="2026-03-19 19:44:01.913316036 +0000 UTC m=+2866.667384359" observedRunningTime="2026-03-19 19:44:02.585763671 +0000 UTC m=+2867.339831974" watchObservedRunningTime="2026-03-19 19:44:02.597172468 +0000 UTC m=+2867.351240781" Mar 19 19:44:03 crc kubenswrapper[4826]: I0319 19:44:03.583413 4826 generic.go:334] "Generic (PLEG): container finished" podID="3edb6db8-83eb-40a2-b680-cdbf4524b608" containerID="65c80aeee7cb6c0e621b68a4b4202a2ff50636e37fdeb37cbf5da06e3aea8210" exitCode=0 Mar 19 19:44:03 crc kubenswrapper[4826]: I0319 19:44:03.583467 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" event={"ID":"3edb6db8-83eb-40a2-b680-cdbf4524b608","Type":"ContainerDied","Data":"65c80aeee7cb6c0e621b68a4b4202a2ff50636e37fdeb37cbf5da06e3aea8210"} Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.023275 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.057154 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr52l\" (UniqueName: \"kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l\") pod \"3edb6db8-83eb-40a2-b680-cdbf4524b608\" (UID: \"3edb6db8-83eb-40a2-b680-cdbf4524b608\") " Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.069180 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l" (OuterVolumeSpecName: "kube-api-access-dr52l") pod "3edb6db8-83eb-40a2-b680-cdbf4524b608" (UID: "3edb6db8-83eb-40a2-b680-cdbf4524b608"). InnerVolumeSpecName "kube-api-access-dr52l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.160238 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr52l\" (UniqueName: \"kubernetes.io/projected/3edb6db8-83eb-40a2-b680-cdbf4524b608-kube-api-access-dr52l\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.612412 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" event={"ID":"3edb6db8-83eb-40a2-b680-cdbf4524b608","Type":"ContainerDied","Data":"5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843"} Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.612452 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c8f899bd640f1d75a254e7a609d5e99de78fe72608c651a82d83cdae4fbc843" Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.612475 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565824-5zkh8" Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.697153 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-wnkgs"] Mar 19 19:44:05 crc kubenswrapper[4826]: I0319 19:44:05.711061 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565818-wnkgs"] Mar 19 19:44:06 crc kubenswrapper[4826]: I0319 19:44:05.998156 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61abc61b-9740-4acd-9ea8-5d49946d31cd" path="/var/lib/kubelet/pods/61abc61b-9740-4acd-9ea8-5d49946d31cd/volumes" Mar 19 19:44:25 crc kubenswrapper[4826]: I0319 19:44:25.401208 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 19:44:25 crc kubenswrapper[4826]: I0319 19:44:25.403511 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:44:25 crc kubenswrapper[4826]: I0319 19:44:25.874608 4826 generic.go:334] "Generic (PLEG): container finished" podID="b0112fd9-267c-4357-8120-f42c43662900" containerID="25615835ae77cc7386e0da64324b69147ed0cb70121720fa2b7e264997c4be29" exitCode=0 Mar 19 19:44:25 crc kubenswrapper[4826]: I0319 19:44:25.874697 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" event={"ID":"b0112fd9-267c-4357-8120-f42c43662900","Type":"ContainerDied","Data":"25615835ae77cc7386e0da64324b69147ed0cb70121720fa2b7e264997c4be29"} Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.556427 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640292 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640367 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640422 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640496 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640590 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc 
kubenswrapper[4826]: I0319 19:44:27.640612 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.640628 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg9bf\" (UniqueName: \"kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf\") pod \"b0112fd9-267c-4357-8120-f42c43662900\" (UID: \"b0112fd9-267c-4357-8120-f42c43662900\") " Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.646481 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf" (OuterVolumeSpecName: "kube-api-access-mg9bf") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "kube-api-access-mg9bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.651043 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.682428 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory" (OuterVolumeSpecName: "inventory") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.682766 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.705611 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.714811 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.726796 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b0112fd9-267c-4357-8120-f42c43662900" (UID: "b0112fd9-267c-4357-8120-f42c43662900"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743291 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743320 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743331 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg9bf\" (UniqueName: \"kubernetes.io/projected/b0112fd9-267c-4357-8120-f42c43662900-kube-api-access-mg9bf\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743343 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743351 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-ceilometer-compute-config-data-2\") on node 
\"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743361 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.743369 4826 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0112fd9-267c-4357-8120-f42c43662900-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.904093 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" event={"ID":"b0112fd9-267c-4357-8120-f42c43662900","Type":"ContainerDied","Data":"6b2961f97ceabd95504ebbaa3065a5f2e7743cd4368fc900782f1e3f59e2f11b"} Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.904825 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2961f97ceabd95504ebbaa3065a5f2e7743cd4368fc900782f1e3f59e2f11b" Mar 19 19:44:27 crc kubenswrapper[4826]: I0319 19:44:27.904144 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-78qhl" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.027293 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h"] Mar 19 19:44:28 crc kubenswrapper[4826]: E0319 19:44:28.027751 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edb6db8-83eb-40a2-b680-cdbf4524b608" containerName="oc" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.027769 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edb6db8-83eb-40a2-b680-cdbf4524b608" containerName="oc" Mar 19 19:44:28 crc kubenswrapper[4826]: E0319 19:44:28.027788 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0112fd9-267c-4357-8120-f42c43662900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.027795 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0112fd9-267c-4357-8120-f42c43662900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.028048 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0112fd9-267c-4357-8120-f42c43662900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.028081 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edb6db8-83eb-40a2-b680-cdbf4524b608" containerName="oc" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.028824 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.031529 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.031623 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.031796 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.032130 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.032390 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.049892 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h"] Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.154291 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.154423 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.154698 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzvpr\" (UniqueName: \"kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.154835 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.154913 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.155006 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.155071 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257585 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257706 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257768 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257858 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzvpr\" (UniqueName: \"kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257896 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257927 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.257967 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.265470 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.265746 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.265921 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.266002 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.266436 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.269425 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.278095 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzvpr\" (UniqueName: \"kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.347063 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:44:28 crc kubenswrapper[4826]: W0319 19:44:28.994396 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22837e37_d9e9_4044_b5bf_34a4c866ebf0.slice/crio-41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7 WatchSource:0}: Error finding container 41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7: Status 404 returned error can't find the container with id 41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7 Mar 19 19:44:28 crc kubenswrapper[4826]: I0319 19:44:28.997946 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h"] Mar 19 19:44:29 crc kubenswrapper[4826]: I0319 19:44:29.943383 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" event={"ID":"22837e37-d9e9-4044-b5bf-34a4c866ebf0","Type":"ContainerStarted","Data":"4d000d55712c92afb7e23e27e81c3003c45a0b3b984e6f508623bd88346b6770"} Mar 19 19:44:29 crc kubenswrapper[4826]: I0319 19:44:29.943913 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" event={"ID":"22837e37-d9e9-4044-b5bf-34a4c866ebf0","Type":"ContainerStarted","Data":"41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7"} Mar 19 19:44:29 crc kubenswrapper[4826]: I0319 19:44:29.971918 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" podStartSLOduration=1.484509637 podStartE2EDuration="1.971890418s" podCreationTimestamp="2026-03-19 19:44:28 +0000 UTC" firstStartedPulling="2026-03-19 19:44:28.997772351 +0000 UTC m=+2893.751840664" lastFinishedPulling="2026-03-19 
19:44:29.485153132 +0000 UTC m=+2894.239221445" observedRunningTime="2026-03-19 19:44:29.963926044 +0000 UTC m=+2894.717994357" watchObservedRunningTime="2026-03-19 19:44:29.971890418 +0000 UTC m=+2894.725958761" Mar 19 19:44:33 crc kubenswrapper[4826]: I0319 19:44:33.189210 4826 scope.go:117] "RemoveContainer" containerID="4dc34e9c32d100aaa792c14dabd72182c65b8ac3e20f67568af720d1c7dcfe22" Mar 19 19:44:55 crc kubenswrapper[4826]: I0319 19:44:55.400500 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:44:55 crc kubenswrapper[4826]: I0319 19:44:55.401323 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:44:55 crc kubenswrapper[4826]: I0319 19:44:55.401412 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:44:55 crc kubenswrapper[4826]: I0319 19:44:55.403045 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:44:55 crc kubenswrapper[4826]: I0319 19:44:55.403182 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" 
podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" gracePeriod=600 Mar 19 19:44:55 crc kubenswrapper[4826]: E0319 19:44:55.529607 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:44:56 crc kubenswrapper[4826]: I0319 19:44:56.297800 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" exitCode=0 Mar 19 19:44:56 crc kubenswrapper[4826]: I0319 19:44:56.297857 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3"} Mar 19 19:44:56 crc kubenswrapper[4826]: I0319 19:44:56.297895 4826 scope.go:117] "RemoveContainer" containerID="f4d1ea77252a9b2871e0b60c8046505311f4f02f5e721a5099d9cfb1206679db" Mar 19 19:44:56 crc kubenswrapper[4826]: I0319 19:44:56.299134 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:44:56 crc kubenswrapper[4826]: E0319 19:44:56.299875 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.154902 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw"] Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.157339 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.159278 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.161240 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.186847 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw"] Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.237460 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.237558 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4dtm\" (UniqueName: \"kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm\") pod \"collect-profiles-29565825-229hw\" (UID: 
\"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.237634 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.340281 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.340501 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4dtm\" (UniqueName: \"kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.340787 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.342383 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.346889 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.360363 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4dtm\" (UniqueName: \"kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm\") pod \"collect-profiles-29565825-229hw\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:00 crc kubenswrapper[4826]: I0319 19:45:00.485378 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:01 crc kubenswrapper[4826]: I0319 19:45:01.055033 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw"] Mar 19 19:45:01 crc kubenswrapper[4826]: I0319 19:45:01.412307 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" event={"ID":"af735e0a-a52c-4254-843b-fd55bee90670","Type":"ContainerStarted","Data":"bc6081120dcaba9b18b8df60a7de48e7bc07d3c360e614e940234e59143fd2c7"} Mar 19 19:45:01 crc kubenswrapper[4826]: I0319 19:45:01.412741 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" event={"ID":"af735e0a-a52c-4254-843b-fd55bee90670","Type":"ContainerStarted","Data":"ad667f8aea0b0c573c31ffd288d10549d1927d16c25d61662e76f10afbb5e2e1"} Mar 19 19:45:01 crc kubenswrapper[4826]: I0319 19:45:01.440475 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" podStartSLOduration=1.440455634 podStartE2EDuration="1.440455634s" podCreationTimestamp="2026-03-19 19:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 19:45:01.430600556 +0000 UTC m=+2926.184668879" watchObservedRunningTime="2026-03-19 19:45:01.440455634 +0000 UTC m=+2926.194523947" Mar 19 19:45:02 crc kubenswrapper[4826]: I0319 19:45:02.429380 4826 generic.go:334] "Generic (PLEG): container finished" podID="af735e0a-a52c-4254-843b-fd55bee90670" containerID="bc6081120dcaba9b18b8df60a7de48e7bc07d3c360e614e940234e59143fd2c7" exitCode=0 Mar 19 19:45:02 crc kubenswrapper[4826]: I0319 19:45:02.429437 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" event={"ID":"af735e0a-a52c-4254-843b-fd55bee90670","Type":"ContainerDied","Data":"bc6081120dcaba9b18b8df60a7de48e7bc07d3c360e614e940234e59143fd2c7"} Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.870932 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.934169 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4dtm\" (UniqueName: \"kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm\") pod \"af735e0a-a52c-4254-843b-fd55bee90670\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.934359 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume\") pod \"af735e0a-a52c-4254-843b-fd55bee90670\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.934400 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume\") pod \"af735e0a-a52c-4254-843b-fd55bee90670\" (UID: \"af735e0a-a52c-4254-843b-fd55bee90670\") " Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.935709 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume" (OuterVolumeSpecName: "config-volume") pod "af735e0a-a52c-4254-843b-fd55bee90670" (UID: "af735e0a-a52c-4254-843b-fd55bee90670"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.945936 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm" (OuterVolumeSpecName: "kube-api-access-m4dtm") pod "af735e0a-a52c-4254-843b-fd55bee90670" (UID: "af735e0a-a52c-4254-843b-fd55bee90670"). InnerVolumeSpecName "kube-api-access-m4dtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:45:03 crc kubenswrapper[4826]: I0319 19:45:03.956857 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af735e0a-a52c-4254-843b-fd55bee90670" (UID: "af735e0a-a52c-4254-843b-fd55bee90670"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.039227 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af735e0a-a52c-4254-843b-fd55bee90670-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.040297 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af735e0a-a52c-4254-843b-fd55bee90670-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.040327 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4dtm\" (UniqueName: \"kubernetes.io/projected/af735e0a-a52c-4254-843b-fd55bee90670-kube-api-access-m4dtm\") on node \"crc\" DevicePath \"\"" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.457604 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" 
event={"ID":"af735e0a-a52c-4254-843b-fd55bee90670","Type":"ContainerDied","Data":"ad667f8aea0b0c573c31ffd288d10549d1927d16c25d61662e76f10afbb5e2e1"} Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.457648 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.457703 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad667f8aea0b0c573c31ffd288d10549d1927d16c25d61662e76f10afbb5e2e1" Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.544899 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"] Mar 19 19:45:04 crc kubenswrapper[4826]: I0319 19:45:04.560584 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565780-79mff"] Mar 19 19:45:06 crc kubenswrapper[4826]: I0319 19:45:06.005989 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797e9c0f-d7d9-461c-ab73-7cda8c133c4d" path="/var/lib/kubelet/pods/797e9c0f-d7d9-461c-ab73-7cda8c133c4d/volumes" Mar 19 19:45:07 crc kubenswrapper[4826]: I0319 19:45:07.976489 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:45:07 crc kubenswrapper[4826]: E0319 19:45:07.977106 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:45:22 crc kubenswrapper[4826]: I0319 19:45:22.976327 4826 scope.go:117] "RemoveContainer" 
containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:45:22 crc kubenswrapper[4826]: E0319 19:45:22.977556 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:45:33 crc kubenswrapper[4826]: I0319 19:45:33.313145 4826 scope.go:117] "RemoveContainer" containerID="1cf3c843ba2a4331da681572d286424c30a9112a0cb639da13e8ea62b88d6cb3" Mar 19 19:45:36 crc kubenswrapper[4826]: I0319 19:45:36.977823 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:45:36 crc kubenswrapper[4826]: E0319 19:45:36.978853 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:45:49 crc kubenswrapper[4826]: I0319 19:45:49.977145 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:45:49 crc kubenswrapper[4826]: E0319 19:45:49.978288 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.156946 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565826-cv8qd"] Mar 19 19:46:00 crc kubenswrapper[4826]: E0319 19:46:00.158005 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af735e0a-a52c-4254-843b-fd55bee90670" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.158019 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="af735e0a-a52c-4254-843b-fd55bee90670" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.158225 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="af735e0a-a52c-4254-843b-fd55bee90670" containerName="collect-profiles" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.159000 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.161090 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.161770 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.164316 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.173290 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-cv8qd"] Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.329713 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5hbz\" (UniqueName: \"kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz\") pod \"auto-csr-approver-29565826-cv8qd\" (UID: \"079e8287-f64c-474e-96fd-6e0327e99a0b\") " pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.431843 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5hbz\" (UniqueName: \"kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz\") pod \"auto-csr-approver-29565826-cv8qd\" (UID: \"079e8287-f64c-474e-96fd-6e0327e99a0b\") " pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.454722 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5hbz\" (UniqueName: \"kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz\") pod \"auto-csr-approver-29565826-cv8qd\" (UID: \"079e8287-f64c-474e-96fd-6e0327e99a0b\") " 
pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:00 crc kubenswrapper[4826]: I0319 19:46:00.481723 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:01 crc kubenswrapper[4826]: I0319 19:46:01.015813 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-cv8qd"] Mar 19 19:46:01 crc kubenswrapper[4826]: I0319 19:46:01.024865 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:46:01 crc kubenswrapper[4826]: I0319 19:46:01.146053 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" event={"ID":"079e8287-f64c-474e-96fd-6e0327e99a0b","Type":"ContainerStarted","Data":"202daa8d0821248af00149de14e131adec1fd95d1990e4b0a78af6d941bf39ec"} Mar 19 19:46:03 crc kubenswrapper[4826]: I0319 19:46:03.176103 4826 generic.go:334] "Generic (PLEG): container finished" podID="079e8287-f64c-474e-96fd-6e0327e99a0b" containerID="17d502b32a7e6b1deb33efcaf60f66e82eebfd461c5f2450b5ea0d8c21b6c280" exitCode=0 Mar 19 19:46:03 crc kubenswrapper[4826]: I0319 19:46:03.176150 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" event={"ID":"079e8287-f64c-474e-96fd-6e0327e99a0b","Type":"ContainerDied","Data":"17d502b32a7e6b1deb33efcaf60f66e82eebfd461c5f2450b5ea0d8c21b6c280"} Mar 19 19:46:03 crc kubenswrapper[4826]: I0319 19:46:03.981045 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:46:03 crc kubenswrapper[4826]: E0319 19:46:03.981348 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:46:04 crc kubenswrapper[4826]: I0319 19:46:04.653822 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:04 crc kubenswrapper[4826]: I0319 19:46:04.740704 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5hbz\" (UniqueName: \"kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz\") pod \"079e8287-f64c-474e-96fd-6e0327e99a0b\" (UID: \"079e8287-f64c-474e-96fd-6e0327e99a0b\") " Mar 19 19:46:04 crc kubenswrapper[4826]: I0319 19:46:04.753427 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz" (OuterVolumeSpecName: "kube-api-access-w5hbz") pod "079e8287-f64c-474e-96fd-6e0327e99a0b" (UID: "079e8287-f64c-474e-96fd-6e0327e99a0b"). InnerVolumeSpecName "kube-api-access-w5hbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:46:04 crc kubenswrapper[4826]: I0319 19:46:04.845200 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5hbz\" (UniqueName: \"kubernetes.io/projected/079e8287-f64c-474e-96fd-6e0327e99a0b-kube-api-access-w5hbz\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:05 crc kubenswrapper[4826]: I0319 19:46:05.212716 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" event={"ID":"079e8287-f64c-474e-96fd-6e0327e99a0b","Type":"ContainerDied","Data":"202daa8d0821248af00149de14e131adec1fd95d1990e4b0a78af6d941bf39ec"} Mar 19 19:46:05 crc kubenswrapper[4826]: I0319 19:46:05.212789 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202daa8d0821248af00149de14e131adec1fd95d1990e4b0a78af6d941bf39ec" Mar 19 19:46:05 crc kubenswrapper[4826]: I0319 19:46:05.212848 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565826-cv8qd" Mar 19 19:46:05 crc kubenswrapper[4826]: I0319 19:46:05.770135 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-4q9l7"] Mar 19 19:46:05 crc kubenswrapper[4826]: I0319 19:46:05.785032 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565820-4q9l7"] Mar 19 19:46:06 crc kubenswrapper[4826]: I0319 19:46:06.002863 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b0f07e-6a17-4cdf-ba58-0e8835e71c60" path="/var/lib/kubelet/pods/e7b0f07e-6a17-4cdf-ba58-0e8835e71c60/volumes" Mar 19 19:46:16 crc kubenswrapper[4826]: I0319 19:46:16.976435 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:46:16 crc kubenswrapper[4826]: E0319 19:46:16.977318 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:46:29 crc kubenswrapper[4826]: I0319 19:46:29.977570 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:46:29 crc kubenswrapper[4826]: E0319 19:46:29.979032 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:46:33 crc kubenswrapper[4826]: I0319 19:46:33.384590 4826 scope.go:117] "RemoveContainer" containerID="7faa0834afc197d5950427f0c944db59b695dacf648b7f57b5ada0678cff3e3a" Mar 19 19:46:38 crc kubenswrapper[4826]: I0319 19:46:38.653517 4826 generic.go:334] "Generic (PLEG): container finished" podID="22837e37-d9e9-4044-b5bf-34a4c866ebf0" containerID="4d000d55712c92afb7e23e27e81c3003c45a0b3b984e6f508623bd88346b6770" exitCode=0 Mar 19 19:46:38 crc kubenswrapper[4826]: I0319 19:46:38.653610 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" event={"ID":"22837e37-d9e9-4044-b5bf-34a4c866ebf0","Type":"ContainerDied","Data":"4d000d55712c92afb7e23e27e81c3003c45a0b3b984e6f508623bd88346b6770"} Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.243991 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.308077 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.308266 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzvpr\" (UniqueName: \"kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.308298 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.308393 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.308418 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 
19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.309380 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.309426 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2\") pod \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\" (UID: \"22837e37-d9e9-4044-b5bf-34a4c866ebf0\") " Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.316347 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.358866 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr" (OuterVolumeSpecName: "kube-api-access-lzvpr") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "kube-api-access-lzvpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.364771 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.367250 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.368509 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory" (OuterVolumeSpecName: "inventory") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.384850 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.392622 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "22837e37-d9e9-4044-b5bf-34a4c866ebf0" (UID: "22837e37-d9e9-4044-b5bf-34a4c866ebf0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411743 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411772 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411786 4826 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411797 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzvpr\" (UniqueName: \"kubernetes.io/projected/22837e37-d9e9-4044-b5bf-34a4c866ebf0-kube-api-access-lzvpr\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411806 4826 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ceilometer-ipmi-config-data-1\") on node 
\"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411815 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.411824 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22837e37-d9e9-4044-b5bf-34a4c866ebf0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.695649 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" event={"ID":"22837e37-d9e9-4044-b5bf-34a4c866ebf0","Type":"ContainerDied","Data":"41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7"} Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.696003 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41cd2003b779cc0d0eed50f57e3ac94370a6aa558673c9143bf153f8d90d22a7" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.695731 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.838918 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b"] Mar 19 19:46:40 crc kubenswrapper[4826]: E0319 19:46:40.839755 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22837e37-d9e9-4044-b5bf-34a4c866ebf0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.839787 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="22837e37-d9e9-4044-b5bf-34a4c866ebf0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 19:46:40 crc kubenswrapper[4826]: E0319 19:46:40.839818 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079e8287-f64c-474e-96fd-6e0327e99a0b" containerName="oc" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.839832 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="079e8287-f64c-474e-96fd-6e0327e99a0b" containerName="oc" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.840252 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="079e8287-f64c-474e-96fd-6e0327e99a0b" containerName="oc" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.840323 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="22837e37-d9e9-4044-b5bf-34a4c866ebf0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.841592 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.844050 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jchxw" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.844484 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.844529 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.844785 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.845122 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.858841 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b"] Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.925689 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.925944 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: 
\"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.926088 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.926235 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:40 crc kubenswrapper[4826]: I0319 19:46:40.926280 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96w8\" (UniqueName: \"kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.028477 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc 
kubenswrapper[4826]: I0319 19:46:41.028575 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.028608 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s96w8\" (UniqueName: \"kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.028694 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.028854 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.033408 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.034041 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.034545 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.035769 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.047310 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96w8\" (UniqueName: \"kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-5j76b\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.174417 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:46:41 crc kubenswrapper[4826]: I0319 19:46:41.804085 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b"] Mar 19 19:46:42 crc kubenswrapper[4826]: I0319 19:46:42.717269 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" event={"ID":"71a93dcb-a020-4cb2-b7bc-66042b67be66","Type":"ContainerStarted","Data":"b74862c7cdd6880a6df96f009b0f6ba0e1a6f89b2f28fadd7bbee2bfcf0687aa"} Mar 19 19:46:42 crc kubenswrapper[4826]: I0319 19:46:42.717841 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" event={"ID":"71a93dcb-a020-4cb2-b7bc-66042b67be66","Type":"ContainerStarted","Data":"09897e66f0b6078598fc1ece8ad1176399a9361b650d7f7081c29a39a978fe08"} Mar 19 19:46:42 crc kubenswrapper[4826]: I0319 19:46:42.735174 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" podStartSLOduration=2.217108702 podStartE2EDuration="2.73515698s" podCreationTimestamp="2026-03-19 19:46:40 +0000 UTC" firstStartedPulling="2026-03-19 19:46:41.804857941 +0000 UTC m=+3026.558926264" lastFinishedPulling="2026-03-19 19:46:42.322906229 +0000 UTC m=+3027.076974542" observedRunningTime="2026-03-19 19:46:42.732837106 +0000 UTC m=+3027.486905419" watchObservedRunningTime="2026-03-19 19:46:42.73515698 +0000 UTC m=+3027.489225293" Mar 19 19:46:43 crc kubenswrapper[4826]: I0319 19:46:43.985403 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:46:43 crc kubenswrapper[4826]: E0319 
19:46:43.986022 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:46:58 crc kubenswrapper[4826]: I0319 19:46:58.923457 4826 generic.go:334] "Generic (PLEG): container finished" podID="71a93dcb-a020-4cb2-b7bc-66042b67be66" containerID="b74862c7cdd6880a6df96f009b0f6ba0e1a6f89b2f28fadd7bbee2bfcf0687aa" exitCode=0 Mar 19 19:46:58 crc kubenswrapper[4826]: I0319 19:46:58.923478 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" event={"ID":"71a93dcb-a020-4cb2-b7bc-66042b67be66","Type":"ContainerDied","Data":"b74862c7cdd6880a6df96f009b0f6ba0e1a6f89b2f28fadd7bbee2bfcf0687aa"} Mar 19 19:46:58 crc kubenswrapper[4826]: I0319 19:46:58.990920 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:46:58 crc kubenswrapper[4826]: E0319 19:46:58.991371 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.536084 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.633683 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory\") pod \"71a93dcb-a020-4cb2-b7bc-66042b67be66\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.633932 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0\") pod \"71a93dcb-a020-4cb2-b7bc-66042b67be66\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.634048 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1\") pod \"71a93dcb-a020-4cb2-b7bc-66042b67be66\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.634112 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam\") pod \"71a93dcb-a020-4cb2-b7bc-66042b67be66\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.634175 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s96w8\" (UniqueName: \"kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8\") pod \"71a93dcb-a020-4cb2-b7bc-66042b67be66\" (UID: \"71a93dcb-a020-4cb2-b7bc-66042b67be66\") " Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 
19:47:00.650609 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8" (OuterVolumeSpecName: "kube-api-access-s96w8") pod "71a93dcb-a020-4cb2-b7bc-66042b67be66" (UID: "71a93dcb-a020-4cb2-b7bc-66042b67be66"). InnerVolumeSpecName "kube-api-access-s96w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.665629 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory" (OuterVolumeSpecName: "inventory") pod "71a93dcb-a020-4cb2-b7bc-66042b67be66" (UID: "71a93dcb-a020-4cb2-b7bc-66042b67be66"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.671274 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "71a93dcb-a020-4cb2-b7bc-66042b67be66" (UID: "71a93dcb-a020-4cb2-b7bc-66042b67be66"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.677772 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "71a93dcb-a020-4cb2-b7bc-66042b67be66" (UID: "71a93dcb-a020-4cb2-b7bc-66042b67be66"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.679667 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "71a93dcb-a020-4cb2-b7bc-66042b67be66" (UID: "71a93dcb-a020-4cb2-b7bc-66042b67be66"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.738666 4826 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.738711 4826 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.738726 4826 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.738739 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71a93dcb-a020-4cb2-b7bc-66042b67be66-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.738752 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s96w8\" (UniqueName: \"kubernetes.io/projected/71a93dcb-a020-4cb2-b7bc-66042b67be66-kube-api-access-s96w8\") on node \"crc\" DevicePath \"\"" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 
19:47:00.950766 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" event={"ID":"71a93dcb-a020-4cb2-b7bc-66042b67be66","Type":"ContainerDied","Data":"09897e66f0b6078598fc1ece8ad1176399a9361b650d7f7081c29a39a978fe08"} Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.950815 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09897e66f0b6078598fc1ece8ad1176399a9361b650d7f7081c29a39a978fe08" Mar 19 19:47:00 crc kubenswrapper[4826]: I0319 19:47:00.950826 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-5j76b" Mar 19 19:47:13 crc kubenswrapper[4826]: I0319 19:47:13.976153 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:47:13 crc kubenswrapper[4826]: E0319 19:47:13.977055 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:47:27 crc kubenswrapper[4826]: I0319 19:47:27.977271 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:47:27 crc kubenswrapper[4826]: E0319 19:47:27.978203 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:47:38 crc kubenswrapper[4826]: I0319 19:47:38.977647 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:47:38 crc kubenswrapper[4826]: E0319 19:47:38.979142 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:47:51 crc kubenswrapper[4826]: I0319 19:47:51.976934 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:47:51 crc kubenswrapper[4826]: E0319 19:47:51.977777 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.151322 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565828-9v5gm"] Mar 19 19:48:00 crc kubenswrapper[4826]: E0319 19:48:00.152310 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a93dcb-a020-4cb2-b7bc-66042b67be66" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.152326 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71a93dcb-a020-4cb2-b7bc-66042b67be66" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.154371 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a93dcb-a020-4cb2-b7bc-66042b67be66" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.155200 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.157128 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.157134 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.157934 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.171177 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-9v5gm"] Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.263777 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx5w\" (UniqueName: \"kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w\") pod \"auto-csr-approver-29565828-9v5gm\" (UID: \"99a85285-4e30-4fda-9a29-da2ec169a11e\") " pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.365552 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx5w\" (UniqueName: \"kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w\") pod \"auto-csr-approver-29565828-9v5gm\" (UID: 
\"99a85285-4e30-4fda-9a29-da2ec169a11e\") " pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.384993 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx5w\" (UniqueName: \"kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w\") pod \"auto-csr-approver-29565828-9v5gm\" (UID: \"99a85285-4e30-4fda-9a29-da2ec169a11e\") " pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.478274 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:00 crc kubenswrapper[4826]: I0319 19:48:00.929955 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-9v5gm"] Mar 19 19:48:00 crc kubenswrapper[4826]: W0319 19:48:00.941467 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a85285_4e30_4fda_9a29_da2ec169a11e.slice/crio-d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40 WatchSource:0}: Error finding container d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40: Status 404 returned error can't find the container with id d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40 Mar 19 19:48:01 crc kubenswrapper[4826]: I0319 19:48:01.754553 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" event={"ID":"99a85285-4e30-4fda-9a29-da2ec169a11e","Type":"ContainerStarted","Data":"d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40"} Mar 19 19:48:02 crc kubenswrapper[4826]: I0319 19:48:02.771372 4826 generic.go:334] "Generic (PLEG): container finished" podID="99a85285-4e30-4fda-9a29-da2ec169a11e" containerID="a8d403ef5f2775e744e7c44c73209b052a08a8870ad88093b2fbc3c06d4348e2" exitCode=0 
Mar 19 19:48:02 crc kubenswrapper[4826]: I0319 19:48:02.771928 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" event={"ID":"99a85285-4e30-4fda-9a29-da2ec169a11e","Type":"ContainerDied","Data":"a8d403ef5f2775e744e7c44c73209b052a08a8870ad88093b2fbc3c06d4348e2"} Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.239987 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.423006 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fx5w\" (UniqueName: \"kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w\") pod \"99a85285-4e30-4fda-9a29-da2ec169a11e\" (UID: \"99a85285-4e30-4fda-9a29-da2ec169a11e\") " Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.428370 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w" (OuterVolumeSpecName: "kube-api-access-9fx5w") pod "99a85285-4e30-4fda-9a29-da2ec169a11e" (UID: "99a85285-4e30-4fda-9a29-da2ec169a11e"). InnerVolumeSpecName "kube-api-access-9fx5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.526212 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fx5w\" (UniqueName: \"kubernetes.io/projected/99a85285-4e30-4fda-9a29-da2ec169a11e-kube-api-access-9fx5w\") on node \"crc\" DevicePath \"\"" Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.802763 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" event={"ID":"99a85285-4e30-4fda-9a29-da2ec169a11e","Type":"ContainerDied","Data":"d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40"} Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.802819 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d94f5576591d597f13759eee88620b39d9437049ea7703a047e19412aeef6e40" Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.802854 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565828-9v5gm" Mar 19 19:48:04 crc kubenswrapper[4826]: I0319 19:48:04.976944 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:48:04 crc kubenswrapper[4826]: E0319 19:48:04.977839 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:48:05 crc kubenswrapper[4826]: I0319 19:48:05.342342 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-bt6tr"] Mar 19 19:48:05 crc kubenswrapper[4826]: I0319 19:48:05.364563 4826 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565822-bt6tr"] Mar 19 19:48:05 crc kubenswrapper[4826]: I0319 19:48:05.994207 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1" path="/var/lib/kubelet/pods/64c59e3a-0560-4fb5-9e8c-2bb1508b9bd1/volumes" Mar 19 19:48:19 crc kubenswrapper[4826]: I0319 19:48:19.977250 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:48:19 crc kubenswrapper[4826]: E0319 19:48:19.996861 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:48:32 crc kubenswrapper[4826]: I0319 19:48:32.976991 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:48:32 crc kubenswrapper[4826]: E0319 19:48:32.979115 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:48:33 crc kubenswrapper[4826]: I0319 19:48:33.548444 4826 scope.go:117] "RemoveContainer" containerID="532f76c99aef136f1b552bd11a5f2e49684dd5eab2678609af11e40537af7c72" Mar 19 19:48:41 crc kubenswrapper[4826]: I0319 19:48:41.933732 4826 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:41 crc kubenswrapper[4826]: E0319 19:48:41.935004 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a85285-4e30-4fda-9a29-da2ec169a11e" containerName="oc" Mar 19 19:48:41 crc kubenswrapper[4826]: I0319 19:48:41.935026 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a85285-4e30-4fda-9a29-da2ec169a11e" containerName="oc" Mar 19 19:48:41 crc kubenswrapper[4826]: I0319 19:48:41.935496 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a85285-4e30-4fda-9a29-da2ec169a11e" containerName="oc" Mar 19 19:48:41 crc kubenswrapper[4826]: I0319 19:48:41.938443 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:41 crc kubenswrapper[4826]: I0319 19:48:41.974309 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.048696 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.050082 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76zs\" (UniqueName: \"kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.050240 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.153254 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.154109 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76zs\" (UniqueName: \"kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.154180 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.154223 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.154905 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.175486 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76zs\" (UniqueName: \"kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs\") pod \"certified-operators-k2bd8\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.274842 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:42 crc kubenswrapper[4826]: I0319 19:48:42.778054 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:42 crc kubenswrapper[4826]: W0319 19:48:42.792386 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee384521_b042_4a17_a82d_f45b3670ddd2.slice/crio-31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce WatchSource:0}: Error finding container 31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce: Status 404 returned error can't find the container with id 31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce Mar 19 19:48:43 crc kubenswrapper[4826]: I0319 19:48:43.319475 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerID="48b1823bf746030a0dc02790c20e14d692c7f9dc74a0d710706fb2ad44ee9c6e" exitCode=0 Mar 19 19:48:43 crc kubenswrapper[4826]: I0319 19:48:43.319602 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" 
event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerDied","Data":"48b1823bf746030a0dc02790c20e14d692c7f9dc74a0d710706fb2ad44ee9c6e"} Mar 19 19:48:43 crc kubenswrapper[4826]: I0319 19:48:43.319935 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerStarted","Data":"31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce"} Mar 19 19:48:44 crc kubenswrapper[4826]: I0319 19:48:44.335239 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerStarted","Data":"b9ded1c57579a6eb26b394238531504495f1bb4cf1532eb6d85b7a04dd194b23"} Mar 19 19:48:46 crc kubenswrapper[4826]: I0319 19:48:46.355523 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerID="b9ded1c57579a6eb26b394238531504495f1bb4cf1532eb6d85b7a04dd194b23" exitCode=0 Mar 19 19:48:46 crc kubenswrapper[4826]: I0319 19:48:46.355562 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerDied","Data":"b9ded1c57579a6eb26b394238531504495f1bb4cf1532eb6d85b7a04dd194b23"} Mar 19 19:48:46 crc kubenswrapper[4826]: I0319 19:48:46.976834 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:48:46 crc kubenswrapper[4826]: E0319 19:48:46.977351 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" 
podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:48:47 crc kubenswrapper[4826]: I0319 19:48:47.370341 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerStarted","Data":"612debb85fae249165d50efb24aa6baa05888f2a85698c6ae6dbc1922602dea1"} Mar 19 19:48:47 crc kubenswrapper[4826]: I0319 19:48:47.409959 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k2bd8" podStartSLOduration=2.948869711 podStartE2EDuration="6.409929039s" podCreationTimestamp="2026-03-19 19:48:41 +0000 UTC" firstStartedPulling="2026-03-19 19:48:43.321947223 +0000 UTC m=+3148.076015546" lastFinishedPulling="2026-03-19 19:48:46.783006531 +0000 UTC m=+3151.537074874" observedRunningTime="2026-03-19 19:48:47.390069978 +0000 UTC m=+3152.144138371" watchObservedRunningTime="2026-03-19 19:48:47.409929039 +0000 UTC m=+3152.163997412" Mar 19 19:48:52 crc kubenswrapper[4826]: I0319 19:48:52.275375 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:52 crc kubenswrapper[4826]: I0319 19:48:52.276035 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:52 crc kubenswrapper[4826]: I0319 19:48:52.353897 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:52 crc kubenswrapper[4826]: I0319 19:48:52.509581 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:55 crc kubenswrapper[4826]: I0319 19:48:55.921419 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:55 crc kubenswrapper[4826]: I0319 
19:48:55.922154 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k2bd8" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="registry-server" containerID="cri-o://612debb85fae249165d50efb24aa6baa05888f2a85698c6ae6dbc1922602dea1" gracePeriod=2 Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.509538 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerID="612debb85fae249165d50efb24aa6baa05888f2a85698c6ae6dbc1922602dea1" exitCode=0 Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.509887 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerDied","Data":"612debb85fae249165d50efb24aa6baa05888f2a85698c6ae6dbc1922602dea1"} Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.509917 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2bd8" event={"ID":"ee384521-b042-4a17-a82d-f45b3670ddd2","Type":"ContainerDied","Data":"31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce"} Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.509945 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31b89174317d8f4778e029b7e7e4fab5abad653bedd31c3dce68a0fd1e2b54ce" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.598004 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.770774 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76zs\" (UniqueName: \"kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs\") pod \"ee384521-b042-4a17-a82d-f45b3670ddd2\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.771256 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content\") pod \"ee384521-b042-4a17-a82d-f45b3670ddd2\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.771459 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities\") pod \"ee384521-b042-4a17-a82d-f45b3670ddd2\" (UID: \"ee384521-b042-4a17-a82d-f45b3670ddd2\") " Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.772169 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities" (OuterVolumeSpecName: "utilities") pod "ee384521-b042-4a17-a82d-f45b3670ddd2" (UID: "ee384521-b042-4a17-a82d-f45b3670ddd2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.778733 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.779695 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs" (OuterVolumeSpecName: "kube-api-access-t76zs") pod "ee384521-b042-4a17-a82d-f45b3670ddd2" (UID: "ee384521-b042-4a17-a82d-f45b3670ddd2"). InnerVolumeSpecName "kube-api-access-t76zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.845085 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee384521-b042-4a17-a82d-f45b3670ddd2" (UID: "ee384521-b042-4a17-a82d-f45b3670ddd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.880677 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76zs\" (UniqueName: \"kubernetes.io/projected/ee384521-b042-4a17-a82d-f45b3670ddd2-kube-api-access-t76zs\") on node \"crc\" DevicePath \"\"" Mar 19 19:48:56 crc kubenswrapper[4826]: I0319 19:48:56.880710 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee384521-b042-4a17-a82d-f45b3670ddd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:48:57 crc kubenswrapper[4826]: I0319 19:48:57.520778 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2bd8" Mar 19 19:48:57 crc kubenswrapper[4826]: I0319 19:48:57.571429 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:57 crc kubenswrapper[4826]: I0319 19:48:57.581772 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k2bd8"] Mar 19 19:48:57 crc kubenswrapper[4826]: I0319 19:48:57.990168 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" path="/var/lib/kubelet/pods/ee384521-b042-4a17-a82d-f45b3670ddd2/volumes" Mar 19 19:48:59 crc kubenswrapper[4826]: I0319 19:48:59.977447 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:48:59 crc kubenswrapper[4826]: E0319 19:48:59.978126 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:49:11 crc kubenswrapper[4826]: I0319 19:49:11.981804 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:49:11 crc kubenswrapper[4826]: E0319 19:49:11.983182 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" 
podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:49:23 crc kubenswrapper[4826]: I0319 19:49:23.977732 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:49:23 crc kubenswrapper[4826]: E0319 19:49:23.978628 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:49:38 crc kubenswrapper[4826]: I0319 19:49:38.984412 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:49:38 crc kubenswrapper[4826]: E0319 19:49:38.985271 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:49:53 crc kubenswrapper[4826]: I0319 19:49:53.984767 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:49:53 crc kubenswrapper[4826]: E0319 19:49:53.986273 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.195440 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565830-rdm7n"] Mar 19 19:50:00 crc kubenswrapper[4826]: E0319 19:50:00.196459 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="registry-server" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.196474 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="registry-server" Mar 19 19:50:00 crc kubenswrapper[4826]: E0319 19:50:00.196508 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="extract-content" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.196516 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="extract-content" Mar 19 19:50:00 crc kubenswrapper[4826]: E0319 19:50:00.196552 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="extract-utilities" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.196561 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="extract-utilities" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.196963 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee384521-b042-4a17-a82d-f45b3670ddd2" containerName="registry-server" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.198096 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.202084 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.202739 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.203000 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.210769 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-rdm7n"] Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.308390 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dlgs\" (UniqueName: \"kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs\") pod \"auto-csr-approver-29565830-rdm7n\" (UID: \"b2740d37-b4cf-4399-91b0-d8fe78dbdd99\") " pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.410410 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dlgs\" (UniqueName: \"kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs\") pod \"auto-csr-approver-29565830-rdm7n\" (UID: \"b2740d37-b4cf-4399-91b0-d8fe78dbdd99\") " pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.447206 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dlgs\" (UniqueName: \"kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs\") pod \"auto-csr-approver-29565830-rdm7n\" (UID: \"b2740d37-b4cf-4399-91b0-d8fe78dbdd99\") " 
pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:00 crc kubenswrapper[4826]: I0319 19:50:00.523126 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:01 crc kubenswrapper[4826]: I0319 19:50:01.089198 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-rdm7n"] Mar 19 19:50:01 crc kubenswrapper[4826]: I0319 19:50:01.318566 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" event={"ID":"b2740d37-b4cf-4399-91b0-d8fe78dbdd99","Type":"ContainerStarted","Data":"f20d95126e7b2183582122a2efec868637c16353b8aa073d52ee31c2a63c4915"} Mar 19 19:50:03 crc kubenswrapper[4826]: I0319 19:50:03.389366 4826 generic.go:334] "Generic (PLEG): container finished" podID="b2740d37-b4cf-4399-91b0-d8fe78dbdd99" containerID="cd2a12d4e9beb80871fb4eaaec2cdaf9127ddad465cc52d495ae195c96f238df" exitCode=0 Mar 19 19:50:03 crc kubenswrapper[4826]: I0319 19:50:03.389780 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" event={"ID":"b2740d37-b4cf-4399-91b0-d8fe78dbdd99","Type":"ContainerDied","Data":"cd2a12d4e9beb80871fb4eaaec2cdaf9127ddad465cc52d495ae195c96f238df"} Mar 19 19:50:04 crc kubenswrapper[4826]: I0319 19:50:04.905400 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:04 crc kubenswrapper[4826]: I0319 19:50:04.953094 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dlgs\" (UniqueName: \"kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs\") pod \"b2740d37-b4cf-4399-91b0-d8fe78dbdd99\" (UID: \"b2740d37-b4cf-4399-91b0-d8fe78dbdd99\") " Mar 19 19:50:04 crc kubenswrapper[4826]: I0319 19:50:04.961782 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs" (OuterVolumeSpecName: "kube-api-access-2dlgs") pod "b2740d37-b4cf-4399-91b0-d8fe78dbdd99" (UID: "b2740d37-b4cf-4399-91b0-d8fe78dbdd99"). InnerVolumeSpecName "kube-api-access-2dlgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:50:05 crc kubenswrapper[4826]: I0319 19:50:05.064390 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dlgs\" (UniqueName: \"kubernetes.io/projected/b2740d37-b4cf-4399-91b0-d8fe78dbdd99-kube-api-access-2dlgs\") on node \"crc\" DevicePath \"\"" Mar 19 19:50:05 crc kubenswrapper[4826]: I0319 19:50:05.418366 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" event={"ID":"b2740d37-b4cf-4399-91b0-d8fe78dbdd99","Type":"ContainerDied","Data":"f20d95126e7b2183582122a2efec868637c16353b8aa073d52ee31c2a63c4915"} Mar 19 19:50:05 crc kubenswrapper[4826]: I0319 19:50:05.418419 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20d95126e7b2183582122a2efec868637c16353b8aa073d52ee31c2a63c4915" Mar 19 19:50:05 crc kubenswrapper[4826]: I0319 19:50:05.418445 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565830-rdm7n" Mar 19 19:50:06 crc kubenswrapper[4826]: I0319 19:50:05.999809 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-5zkh8"] Mar 19 19:50:06 crc kubenswrapper[4826]: I0319 19:50:06.015151 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565824-5zkh8"] Mar 19 19:50:06 crc kubenswrapper[4826]: I0319 19:50:06.976593 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:50:07 crc kubenswrapper[4826]: I0319 19:50:07.441975 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564"} Mar 19 19:50:07 crc kubenswrapper[4826]: I0319 19:50:07.993918 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edb6db8-83eb-40a2-b680-cdbf4524b608" path="/var/lib/kubelet/pods/3edb6db8-83eb-40a2-b680-cdbf4524b608/volumes" Mar 19 19:50:33 crc kubenswrapper[4826]: I0319 19:50:33.686602 4826 scope.go:117] "RemoveContainer" containerID="65c80aeee7cb6c0e621b68a4b4202a2ff50636e37fdeb37cbf5da06e3aea8210" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.526314 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:25 crc kubenswrapper[4826]: E0319 19:51:25.529685 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2740d37-b4cf-4399-91b0-d8fe78dbdd99" containerName="oc" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.529709 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2740d37-b4cf-4399-91b0-d8fe78dbdd99" containerName="oc" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.530012 4826 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b2740d37-b4cf-4399-91b0-d8fe78dbdd99" containerName="oc" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.532166 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.543844 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.604459 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.604674 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btpl6\" (UniqueName: \"kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.604891 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.708075 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities\") pod 
\"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.708155 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btpl6\" (UniqueName: \"kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.708229 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.708699 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.708957 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities\") pod \"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.727961 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btpl6\" (UniqueName: \"kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6\") pod 
\"community-operators-w988l\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:25 crc kubenswrapper[4826]: I0319 19:51:25.862752 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:26 crc kubenswrapper[4826]: I0319 19:51:26.497459 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:27 crc kubenswrapper[4826]: I0319 19:51:27.383171 4826 generic.go:334] "Generic (PLEG): container finished" podID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerID="174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db" exitCode=0 Mar 19 19:51:27 crc kubenswrapper[4826]: I0319 19:51:27.383253 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerDied","Data":"174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db"} Mar 19 19:51:27 crc kubenswrapper[4826]: I0319 19:51:27.383486 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerStarted","Data":"3ca133c0ea523c3e2177e34e7af137bbdb58637c7d96f1ad76e0379c5cdf8651"} Mar 19 19:51:27 crc kubenswrapper[4826]: I0319 19:51:27.386449 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:51:30 crc kubenswrapper[4826]: I0319 19:51:30.434696 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerStarted","Data":"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8"} Mar 19 19:51:31 crc kubenswrapper[4826]: I0319 19:51:31.450804 4826 generic.go:334] "Generic (PLEG): container 
finished" podID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerID="710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8" exitCode=0 Mar 19 19:51:31 crc kubenswrapper[4826]: I0319 19:51:31.451187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerDied","Data":"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8"} Mar 19 19:51:32 crc kubenswrapper[4826]: I0319 19:51:32.472891 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerStarted","Data":"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f"} Mar 19 19:51:32 crc kubenswrapper[4826]: I0319 19:51:32.500935 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w988l" podStartSLOduration=2.6981456230000003 podStartE2EDuration="7.500911038s" podCreationTimestamp="2026-03-19 19:51:25 +0000 UTC" firstStartedPulling="2026-03-19 19:51:27.386148584 +0000 UTC m=+3312.140216917" lastFinishedPulling="2026-03-19 19:51:32.188914019 +0000 UTC m=+3316.942982332" observedRunningTime="2026-03-19 19:51:32.492455977 +0000 UTC m=+3317.246524290" watchObservedRunningTime="2026-03-19 19:51:32.500911038 +0000 UTC m=+3317.254979351" Mar 19 19:51:35 crc kubenswrapper[4826]: I0319 19:51:35.862965 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:35 crc kubenswrapper[4826]: I0319 19:51:35.863787 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:36 crc kubenswrapper[4826]: I0319 19:51:36.933322 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-w988l" 
podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="registry-server" probeResult="failure" output=< Mar 19 19:51:36 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:51:36 crc kubenswrapper[4826]: > Mar 19 19:51:45 crc kubenswrapper[4826]: I0319 19:51:45.918953 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:45 crc kubenswrapper[4826]: I0319 19:51:45.971229 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:46 crc kubenswrapper[4826]: I0319 19:51:46.167197 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:47 crc kubenswrapper[4826]: I0319 19:51:47.680867 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w988l" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="registry-server" containerID="cri-o://d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f" gracePeriod=2 Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.299488 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.405539 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities\") pod \"f2721c5d-b137-4dac-aecb-c424679d5dd7\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.405999 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btpl6\" (UniqueName: \"kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6\") pod \"f2721c5d-b137-4dac-aecb-c424679d5dd7\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.406083 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content\") pod \"f2721c5d-b137-4dac-aecb-c424679d5dd7\" (UID: \"f2721c5d-b137-4dac-aecb-c424679d5dd7\") " Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.406673 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities" (OuterVolumeSpecName: "utilities") pod "f2721c5d-b137-4dac-aecb-c424679d5dd7" (UID: "f2721c5d-b137-4dac-aecb-c424679d5dd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.406911 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.412890 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6" (OuterVolumeSpecName: "kube-api-access-btpl6") pod "f2721c5d-b137-4dac-aecb-c424679d5dd7" (UID: "f2721c5d-b137-4dac-aecb-c424679d5dd7"). InnerVolumeSpecName "kube-api-access-btpl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.461744 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2721c5d-b137-4dac-aecb-c424679d5dd7" (UID: "f2721c5d-b137-4dac-aecb-c424679d5dd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.509069 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btpl6\" (UniqueName: \"kubernetes.io/projected/f2721c5d-b137-4dac-aecb-c424679d5dd7-kube-api-access-btpl6\") on node \"crc\" DevicePath \"\"" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.509143 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2721c5d-b137-4dac-aecb-c424679d5dd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.703413 4826 generic.go:334] "Generic (PLEG): container finished" podID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerID="d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f" exitCode=0 Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.703496 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w988l" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.703531 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerDied","Data":"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f"} Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.704106 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w988l" event={"ID":"f2721c5d-b137-4dac-aecb-c424679d5dd7","Type":"ContainerDied","Data":"3ca133c0ea523c3e2177e34e7af137bbdb58637c7d96f1ad76e0379c5cdf8651"} Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.704172 4826 scope.go:117] "RemoveContainer" containerID="d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.751390 4826 scope.go:117] "RemoveContainer" 
containerID="710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.767051 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.777964 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w988l"] Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.793902 4826 scope.go:117] "RemoveContainer" containerID="174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.861029 4826 scope.go:117] "RemoveContainer" containerID="d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f" Mar 19 19:51:48 crc kubenswrapper[4826]: E0319 19:51:48.863270 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f\": container with ID starting with d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f not found: ID does not exist" containerID="d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.863346 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f"} err="failed to get container status \"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f\": rpc error: code = NotFound desc = could not find container \"d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f\": container with ID starting with d4eaae4ef327f03f190ebc76a0064d9431050fed52477affca9460d927433a8f not found: ID does not exist" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.863388 4826 scope.go:117] "RemoveContainer" 
containerID="710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8" Mar 19 19:51:48 crc kubenswrapper[4826]: E0319 19:51:48.863850 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8\": container with ID starting with 710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8 not found: ID does not exist" containerID="710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.863880 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8"} err="failed to get container status \"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8\": rpc error: code = NotFound desc = could not find container \"710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8\": container with ID starting with 710178deffac0dc7ce006dbbeb2a16275a90f977386c425e8a52cfc0233a7fc8 not found: ID does not exist" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.863905 4826 scope.go:117] "RemoveContainer" containerID="174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db" Mar 19 19:51:48 crc kubenswrapper[4826]: E0319 19:51:48.864250 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db\": container with ID starting with 174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db not found: ID does not exist" containerID="174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db" Mar 19 19:51:48 crc kubenswrapper[4826]: I0319 19:51:48.864416 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db"} err="failed to get container status \"174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db\": rpc error: code = NotFound desc = could not find container \"174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db\": container with ID starting with 174ae597c457e5337acb473b9db8d411a6d514ff647dc1f5287dbb7f41f4a3db not found: ID does not exist" Mar 19 19:51:49 crc kubenswrapper[4826]: I0319 19:51:49.990230 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" path="/var/lib/kubelet/pods/f2721c5d-b137-4dac-aecb-c424679d5dd7/volumes" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.158940 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565832-m6mtl"] Mar 19 19:52:00 crc kubenswrapper[4826]: E0319 19:52:00.159921 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="extract-utilities" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.159934 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="extract-utilities" Mar 19 19:52:00 crc kubenswrapper[4826]: E0319 19:52:00.159958 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="registry-server" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.159965 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="registry-server" Mar 19 19:52:00 crc kubenswrapper[4826]: E0319 19:52:00.159976 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="extract-content" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.159982 4826 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="extract-content" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.160228 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2721c5d-b137-4dac-aecb-c424679d5dd7" containerName="registry-server" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.161021 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.163395 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.167000 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.167015 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.197369 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-m6mtl"] Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.243277 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxkt\" (UniqueName: \"kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt\") pod \"auto-csr-approver-29565832-m6mtl\" (UID: \"fcff6b33-0cd8-4d47-bbb6-74791f4da46f\") " pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.345719 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxkt\" (UniqueName: \"kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt\") pod \"auto-csr-approver-29565832-m6mtl\" (UID: \"fcff6b33-0cd8-4d47-bbb6-74791f4da46f\") " 
pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.366774 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxkt\" (UniqueName: \"kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt\") pod \"auto-csr-approver-29565832-m6mtl\" (UID: \"fcff6b33-0cd8-4d47-bbb6-74791f4da46f\") " pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:00 crc kubenswrapper[4826]: I0319 19:52:00.494987 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:01 crc kubenswrapper[4826]: W0319 19:52:01.057249 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcff6b33_0cd8_4d47_bbb6_74791f4da46f.slice/crio-fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8 WatchSource:0}: Error finding container fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8: Status 404 returned error can't find the container with id fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8 Mar 19 19:52:01 crc kubenswrapper[4826]: I0319 19:52:01.058796 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-m6mtl"] Mar 19 19:52:01 crc kubenswrapper[4826]: I0319 19:52:01.870846 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" event={"ID":"fcff6b33-0cd8-4d47-bbb6-74791f4da46f","Type":"ContainerStarted","Data":"fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8"} Mar 19 19:52:02 crc kubenswrapper[4826]: I0319 19:52:02.884038 4826 generic.go:334] "Generic (PLEG): container finished" podID="fcff6b33-0cd8-4d47-bbb6-74791f4da46f" containerID="a4206b1fc532a8535a9bb2a9bcbdfe3cf1c3b35089f6c55bd28686c117669d5e" exitCode=0 Mar 19 19:52:02 crc kubenswrapper[4826]: 
I0319 19:52:02.884303 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" event={"ID":"fcff6b33-0cd8-4d47-bbb6-74791f4da46f","Type":"ContainerDied","Data":"a4206b1fc532a8535a9bb2a9bcbdfe3cf1c3b35089f6c55bd28686c117669d5e"} Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.386494 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.565859 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrxkt\" (UniqueName: \"kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt\") pod \"fcff6b33-0cd8-4d47-bbb6-74791f4da46f\" (UID: \"fcff6b33-0cd8-4d47-bbb6-74791f4da46f\") " Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.579959 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt" (OuterVolumeSpecName: "kube-api-access-mrxkt") pod "fcff6b33-0cd8-4d47-bbb6-74791f4da46f" (UID: "fcff6b33-0cd8-4d47-bbb6-74791f4da46f"). InnerVolumeSpecName "kube-api-access-mrxkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.669248 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrxkt\" (UniqueName: \"kubernetes.io/projected/fcff6b33-0cd8-4d47-bbb6-74791f4da46f-kube-api-access-mrxkt\") on node \"crc\" DevicePath \"\"" Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.909892 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" event={"ID":"fcff6b33-0cd8-4d47-bbb6-74791f4da46f","Type":"ContainerDied","Data":"fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8"} Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.909973 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad674248644c7fb239f1eca4c32a261acfb9b090943a3b7bf90783c5212bca8" Mar 19 19:52:04 crc kubenswrapper[4826]: I0319 19:52:04.909990 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565832-m6mtl" Mar 19 19:52:05 crc kubenswrapper[4826]: I0319 19:52:05.462226 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-cv8qd"] Mar 19 19:52:05 crc kubenswrapper[4826]: I0319 19:52:05.473990 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565826-cv8qd"] Mar 19 19:52:06 crc kubenswrapper[4826]: I0319 19:52:05.999691 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079e8287-f64c-474e-96fd-6e0327e99a0b" path="/var/lib/kubelet/pods/079e8287-f64c-474e-96fd-6e0327e99a0b/volumes" Mar 19 19:52:11 crc kubenswrapper[4826]: E0319 19:52:11.284814 4826 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:46476->38.102.83.69:34213: write tcp 38.102.83.69:46476->38.102.83.69:34213: write: connection reset by peer Mar 19 19:52:25 crc kubenswrapper[4826]: I0319 
19:52:25.400451 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:52:25 crc kubenswrapper[4826]: I0319 19:52:25.401186 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:52:33 crc kubenswrapper[4826]: I0319 19:52:33.809513 4826 scope.go:117] "RemoveContainer" containerID="17d502b32a7e6b1deb33efcaf60f66e82eebfd461c5f2450b5ea0d8c21b6c280" Mar 19 19:52:55 crc kubenswrapper[4826]: I0319 19:52:55.401088 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:52:55 crc kubenswrapper[4826]: I0319 19:52:55.401964 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.401018 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:53:25 crc 
kubenswrapper[4826]: I0319 19:53:25.401477 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.401520 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.402338 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.402384 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564" gracePeriod=600 Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.956002 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564" exitCode=0 Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.956103 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564"} 
Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.956359 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3"} Mar 19 19:53:25 crc kubenswrapper[4826]: I0319 19:53:25.956387 4826 scope.go:117] "RemoveContainer" containerID="830b819005c423ed80223146f9e5bb7d6cfc2d12b0ead3f59adc695507460ab3" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.160853 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565834-zgtjs"] Mar 19 19:54:00 crc kubenswrapper[4826]: E0319 19:54:00.162150 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff6b33-0cd8-4d47-bbb6-74791f4da46f" containerName="oc" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.162168 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff6b33-0cd8-4d47-bbb6-74791f4da46f" containerName="oc" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.162565 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcff6b33-0cd8-4d47-bbb6-74791f4da46f" containerName="oc" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.163576 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.166528 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.166766 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.167000 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.181830 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-zgtjs"] Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.281429 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjx5\" (UniqueName: \"kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5\") pod \"auto-csr-approver-29565834-zgtjs\" (UID: \"473327a3-285f-478f-9711-fe1c21c8976c\") " pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.383923 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjx5\" (UniqueName: \"kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5\") pod \"auto-csr-approver-29565834-zgtjs\" (UID: \"473327a3-285f-478f-9711-fe1c21c8976c\") " pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.420698 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjx5\" (UniqueName: \"kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5\") pod \"auto-csr-approver-29565834-zgtjs\" (UID: \"473327a3-285f-478f-9711-fe1c21c8976c\") " 
pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.485952 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:00 crc kubenswrapper[4826]: I0319 19:54:00.957179 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-zgtjs"] Mar 19 19:54:01 crc kubenswrapper[4826]: I0319 19:54:01.444112 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" event={"ID":"473327a3-285f-478f-9711-fe1c21c8976c","Type":"ContainerStarted","Data":"1f14babe4bc8f8a8948c023e4efaad8ac5f5e5c91aa9aba63e6de7f43cbc86ab"} Mar 19 19:54:03 crc kubenswrapper[4826]: I0319 19:54:03.470231 4826 generic.go:334] "Generic (PLEG): container finished" podID="473327a3-285f-478f-9711-fe1c21c8976c" containerID="eed37c96caee102079a66ef473badf59fd46444c6eb78daa8597dcbaab0156f9" exitCode=0 Mar 19 19:54:03 crc kubenswrapper[4826]: I0319 19:54:03.470638 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" event={"ID":"473327a3-285f-478f-9711-fe1c21c8976c","Type":"ContainerDied","Data":"eed37c96caee102079a66ef473badf59fd46444c6eb78daa8597dcbaab0156f9"} Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.027673 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.126244 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsjx5\" (UniqueName: \"kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5\") pod \"473327a3-285f-478f-9711-fe1c21c8976c\" (UID: \"473327a3-285f-478f-9711-fe1c21c8976c\") " Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.134383 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5" (OuterVolumeSpecName: "kube-api-access-bsjx5") pod "473327a3-285f-478f-9711-fe1c21c8976c" (UID: "473327a3-285f-478f-9711-fe1c21c8976c"). InnerVolumeSpecName "kube-api-access-bsjx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.229208 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsjx5\" (UniqueName: \"kubernetes.io/projected/473327a3-285f-478f-9711-fe1c21c8976c-kube-api-access-bsjx5\") on node \"crc\" DevicePath \"\"" Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.498244 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" event={"ID":"473327a3-285f-478f-9711-fe1c21c8976c","Type":"ContainerDied","Data":"1f14babe4bc8f8a8948c023e4efaad8ac5f5e5c91aa9aba63e6de7f43cbc86ab"} Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.498309 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f14babe4bc8f8a8948c023e4efaad8ac5f5e5c91aa9aba63e6de7f43cbc86ab" Mar 19 19:54:05 crc kubenswrapper[4826]: I0319 19:54:05.498410 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565834-zgtjs" Mar 19 19:54:06 crc kubenswrapper[4826]: I0319 19:54:06.139010 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-9v5gm"] Mar 19 19:54:06 crc kubenswrapper[4826]: I0319 19:54:06.176466 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565828-9v5gm"] Mar 19 19:54:07 crc kubenswrapper[4826]: I0319 19:54:07.994526 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a85285-4e30-4fda-9a29-da2ec169a11e" path="/var/lib/kubelet/pods/99a85285-4e30-4fda-9a29-da2ec169a11e/volumes" Mar 19 19:54:33 crc kubenswrapper[4826]: I0319 19:54:33.923401 4826 scope.go:117] "RemoveContainer" containerID="a8d403ef5f2775e744e7c44c73209b052a08a8870ad88093b2fbc3c06d4348e2" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.598961 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:54:52 crc kubenswrapper[4826]: E0319 19:54:52.600029 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473327a3-285f-478f-9711-fe1c21c8976c" containerName="oc" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.600045 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="473327a3-285f-478f-9711-fe1c21c8976c" containerName="oc" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.600344 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="473327a3-285f-478f-9711-fe1c21c8976c" containerName="oc" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.602352 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.624561 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.661771 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wnk\" (UniqueName: \"kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.661860 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.661914 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.764976 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wnk\" (UniqueName: \"kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.765079 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.765128 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.765833 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.765939 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.787608 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wnk\" (UniqueName: \"kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk\") pod \"redhat-operators-2hwcr\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:52 crc kubenswrapper[4826]: I0319 19:54:52.952278 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:54:53 crc kubenswrapper[4826]: I0319 19:54:53.499090 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:54:54 crc kubenswrapper[4826]: I0319 19:54:54.166677 4826 generic.go:334] "Generic (PLEG): container finished" podID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerID="7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88" exitCode=0 Mar 19 19:54:54 crc kubenswrapper[4826]: I0319 19:54:54.166728 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerDied","Data":"7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88"} Mar 19 19:54:54 crc kubenswrapper[4826]: I0319 19:54:54.166996 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerStarted","Data":"af23c0757a34f264ab4ca22940ee489e629466bfb725a1797f99bf9a75acd443"} Mar 19 19:54:56 crc kubenswrapper[4826]: I0319 19:54:56.200479 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerStarted","Data":"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305"} Mar 19 19:55:01 crc kubenswrapper[4826]: I0319 19:55:01.267430 4826 generic.go:334] "Generic (PLEG): container finished" podID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerID="30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305" exitCode=0 Mar 19 19:55:01 crc kubenswrapper[4826]: I0319 19:55:01.268367 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" 
event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerDied","Data":"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305"} Mar 19 19:55:02 crc kubenswrapper[4826]: I0319 19:55:02.283491 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerStarted","Data":"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd"} Mar 19 19:55:02 crc kubenswrapper[4826]: I0319 19:55:02.313267 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hwcr" podStartSLOduration=2.796785141 podStartE2EDuration="10.313250401s" podCreationTimestamp="2026-03-19 19:54:52 +0000 UTC" firstStartedPulling="2026-03-19 19:54:54.16930835 +0000 UTC m=+3518.923376663" lastFinishedPulling="2026-03-19 19:55:01.6857736 +0000 UTC m=+3526.439841923" observedRunningTime="2026-03-19 19:55:02.304866878 +0000 UTC m=+3527.058935191" watchObservedRunningTime="2026-03-19 19:55:02.313250401 +0000 UTC m=+3527.067318714" Mar 19 19:55:02 crc kubenswrapper[4826]: I0319 19:55:02.952385 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:02 crc kubenswrapper[4826]: I0319 19:55:02.952792 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:04 crc kubenswrapper[4826]: I0319 19:55:04.012035 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hwcr" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" probeResult="failure" output=< Mar 19 19:55:04 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:55:04 crc kubenswrapper[4826]: > Mar 19 19:55:14 crc kubenswrapper[4826]: I0319 19:55:14.005166 4826 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-2hwcr" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" probeResult="failure" output=< Mar 19 19:55:14 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:55:14 crc kubenswrapper[4826]: > Mar 19 19:55:24 crc kubenswrapper[4826]: I0319 19:55:24.011807 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hwcr" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" probeResult="failure" output=< Mar 19 19:55:24 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:55:24 crc kubenswrapper[4826]: > Mar 19 19:55:25 crc kubenswrapper[4826]: I0319 19:55:25.400588 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:55:25 crc kubenswrapper[4826]: I0319 19:55:25.400921 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:55:33 crc kubenswrapper[4826]: I0319 19:55:33.045950 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:33 crc kubenswrapper[4826]: I0319 19:55:33.097494 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:33 crc kubenswrapper[4826]: I0319 19:55:33.287282 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:55:34 crc kubenswrapper[4826]: I0319 19:55:34.060461 4826 scope.go:117] "RemoveContainer" containerID="612debb85fae249165d50efb24aa6baa05888f2a85698c6ae6dbc1922602dea1" Mar 19 19:55:34 crc kubenswrapper[4826]: I0319 19:55:34.120270 4826 scope.go:117] "RemoveContainer" containerID="48b1823bf746030a0dc02790c20e14d692c7f9dc74a0d710706fb2ad44ee9c6e" Mar 19 19:55:34 crc kubenswrapper[4826]: I0319 19:55:34.146041 4826 scope.go:117] "RemoveContainer" containerID="b9ded1c57579a6eb26b394238531504495f1bb4cf1532eb6d85b7a04dd194b23" Mar 19 19:55:34 crc kubenswrapper[4826]: I0319 19:55:34.698611 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hwcr" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" containerID="cri-o://04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd" gracePeriod=2 Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.289213 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.486070 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content\") pod \"110d7255-55dd-4c5b-b0b1-57e5457691d6\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.489201 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wnk\" (UniqueName: \"kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk\") pod \"110d7255-55dd-4c5b-b0b1-57e5457691d6\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.489346 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities\") pod \"110d7255-55dd-4c5b-b0b1-57e5457691d6\" (UID: \"110d7255-55dd-4c5b-b0b1-57e5457691d6\") " Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.489999 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities" (OuterVolumeSpecName: "utilities") pod "110d7255-55dd-4c5b-b0b1-57e5457691d6" (UID: "110d7255-55dd-4c5b-b0b1-57e5457691d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.490539 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.498064 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk" (OuterVolumeSpecName: "kube-api-access-h9wnk") pod "110d7255-55dd-4c5b-b0b1-57e5457691d6" (UID: "110d7255-55dd-4c5b-b0b1-57e5457691d6"). InnerVolumeSpecName "kube-api-access-h9wnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.593492 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wnk\" (UniqueName: \"kubernetes.io/projected/110d7255-55dd-4c5b-b0b1-57e5457691d6-kube-api-access-h9wnk\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.631914 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "110d7255-55dd-4c5b-b0b1-57e5457691d6" (UID: "110d7255-55dd-4c5b-b0b1-57e5457691d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.696750 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110d7255-55dd-4c5b-b0b1-57e5457691d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.715350 4826 generic.go:334] "Generic (PLEG): container finished" podID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerID="04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd" exitCode=0 Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.715390 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerDied","Data":"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd"} Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.715415 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hwcr" event={"ID":"110d7255-55dd-4c5b-b0b1-57e5457691d6","Type":"ContainerDied","Data":"af23c0757a34f264ab4ca22940ee489e629466bfb725a1797f99bf9a75acd443"} Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.715431 4826 scope.go:117] "RemoveContainer" containerID="04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.715461 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hwcr" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.751412 4826 scope.go:117] "RemoveContainer" containerID="30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.761729 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.774969 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hwcr"] Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.780136 4826 scope.go:117] "RemoveContainer" containerID="7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.835178 4826 scope.go:117] "RemoveContainer" containerID="04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd" Mar 19 19:55:35 crc kubenswrapper[4826]: E0319 19:55:35.835951 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd\": container with ID starting with 04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd not found: ID does not exist" containerID="04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.836027 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd"} err="failed to get container status \"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd\": rpc error: code = NotFound desc = could not find container \"04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd\": container with ID starting with 04d37a8e0ef534d24c26f2ac2a5dd356d5742bc3cf3e8730cb8cc11ca6798bbd not found: ID does 
not exist" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.836086 4826 scope.go:117] "RemoveContainer" containerID="30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305" Mar 19 19:55:35 crc kubenswrapper[4826]: E0319 19:55:35.836520 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305\": container with ID starting with 30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305 not found: ID does not exist" containerID="30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.836599 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305"} err="failed to get container status \"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305\": rpc error: code = NotFound desc = could not find container \"30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305\": container with ID starting with 30d3f5a845ed1d10b7abbc0e05dfb0f59ef0313ce0ea2c3f923a62c02f4a5305 not found: ID does not exist" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.836693 4826 scope.go:117] "RemoveContainer" containerID="7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88" Mar 19 19:55:35 crc kubenswrapper[4826]: E0319 19:55:35.837156 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88\": container with ID starting with 7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88 not found: ID does not exist" containerID="7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.837204 4826 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88"} err="failed to get container status \"7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88\": rpc error: code = NotFound desc = could not find container \"7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88\": container with ID starting with 7f90d1c8903e52021d713f634fb94f37634113bce87e61896c0ebf6e591abf88 not found: ID does not exist" Mar 19 19:55:35 crc kubenswrapper[4826]: I0319 19:55:35.994917 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" path="/var/lib/kubelet/pods/110d7255-55dd-4c5b-b0b1-57e5457691d6/volumes" Mar 19 19:55:55 crc kubenswrapper[4826]: I0319 19:55:55.400379 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:55:55 crc kubenswrapper[4826]: I0319 19:55:55.400853 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.144846 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565836-dkt7n"] Mar 19 19:56:00 crc kubenswrapper[4826]: E0319 19:56:00.147089 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="extract-utilities" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.147113 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="extract-utilities" Mar 19 19:56:00 crc kubenswrapper[4826]: E0319 19:56:00.147128 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.147134 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" Mar 19 19:56:00 crc kubenswrapper[4826]: E0319 19:56:00.147163 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="extract-content" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.147169 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="extract-content" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.147394 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="110d7255-55dd-4c5b-b0b1-57e5457691d6" containerName="registry-server" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.148379 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.151582 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.152002 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.152065 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.165456 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-dkt7n"] Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.283269 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsx2\" (UniqueName: \"kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2\") pod \"auto-csr-approver-29565836-dkt7n\" (UID: \"0f93bd14-0892-4dd0-8a34-3352dcd8e36f\") " pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.386390 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsx2\" (UniqueName: \"kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2\") pod \"auto-csr-approver-29565836-dkt7n\" (UID: \"0f93bd14-0892-4dd0-8a34-3352dcd8e36f\") " pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.415222 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsx2\" (UniqueName: \"kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2\") pod \"auto-csr-approver-29565836-dkt7n\" (UID: \"0f93bd14-0892-4dd0-8a34-3352dcd8e36f\") " 
pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:00 crc kubenswrapper[4826]: I0319 19:56:00.467040 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:01 crc kubenswrapper[4826]: I0319 19:56:01.116607 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-dkt7n"] Mar 19 19:56:02 crc kubenswrapper[4826]: I0319 19:56:02.089271 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" event={"ID":"0f93bd14-0892-4dd0-8a34-3352dcd8e36f","Type":"ContainerStarted","Data":"8d7f0005bf6bda704438a0a80adbb2776c4040ef16cb3ee0be30b70637a7386c"} Mar 19 19:56:03 crc kubenswrapper[4826]: I0319 19:56:03.103529 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" event={"ID":"0f93bd14-0892-4dd0-8a34-3352dcd8e36f","Type":"ContainerStarted","Data":"35355a22aac8ad5229841f60e5c5ee1082cf0ca4d2b9db36d2899c0f43e497d7"} Mar 19 19:56:03 crc kubenswrapper[4826]: I0319 19:56:03.150447 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" podStartSLOduration=2.148534051 podStartE2EDuration="3.150425152s" podCreationTimestamp="2026-03-19 19:56:00 +0000 UTC" firstStartedPulling="2026-03-19 19:56:01.129470845 +0000 UTC m=+3585.883539158" lastFinishedPulling="2026-03-19 19:56:02.131361906 +0000 UTC m=+3586.885430259" observedRunningTime="2026-03-19 19:56:03.1428885 +0000 UTC m=+3587.896956833" watchObservedRunningTime="2026-03-19 19:56:03.150425152 +0000 UTC m=+3587.904493475" Mar 19 19:56:04 crc kubenswrapper[4826]: I0319 19:56:04.116424 4826 generic.go:334] "Generic (PLEG): container finished" podID="0f93bd14-0892-4dd0-8a34-3352dcd8e36f" containerID="35355a22aac8ad5229841f60e5c5ee1082cf0ca4d2b9db36d2899c0f43e497d7" exitCode=0 Mar 19 19:56:04 crc 
kubenswrapper[4826]: I0319 19:56:04.116535 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" event={"ID":"0f93bd14-0892-4dd0-8a34-3352dcd8e36f","Type":"ContainerDied","Data":"35355a22aac8ad5229841f60e5c5ee1082cf0ca4d2b9db36d2899c0f43e497d7"} Mar 19 19:56:05 crc kubenswrapper[4826]: I0319 19:56:05.596169 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:05 crc kubenswrapper[4826]: I0319 19:56:05.791366 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxsx2\" (UniqueName: \"kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2\") pod \"0f93bd14-0892-4dd0-8a34-3352dcd8e36f\" (UID: \"0f93bd14-0892-4dd0-8a34-3352dcd8e36f\") " Mar 19 19:56:05 crc kubenswrapper[4826]: I0319 19:56:05.817065 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2" (OuterVolumeSpecName: "kube-api-access-fxsx2") pod "0f93bd14-0892-4dd0-8a34-3352dcd8e36f" (UID: "0f93bd14-0892-4dd0-8a34-3352dcd8e36f"). InnerVolumeSpecName "kube-api-access-fxsx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:56:05 crc kubenswrapper[4826]: I0319 19:56:05.895088 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxsx2\" (UniqueName: \"kubernetes.io/projected/0f93bd14-0892-4dd0-8a34-3352dcd8e36f-kube-api-access-fxsx2\") on node \"crc\" DevicePath \"\"" Mar 19 19:56:06 crc kubenswrapper[4826]: I0319 19:56:06.701368 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-rdm7n"] Mar 19 19:56:06 crc kubenswrapper[4826]: I0319 19:56:06.704959 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" event={"ID":"0f93bd14-0892-4dd0-8a34-3352dcd8e36f","Type":"ContainerDied","Data":"8d7f0005bf6bda704438a0a80adbb2776c4040ef16cb3ee0be30b70637a7386c"} Mar 19 19:56:06 crc kubenswrapper[4826]: I0319 19:56:06.705024 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7f0005bf6bda704438a0a80adbb2776c4040ef16cb3ee0be30b70637a7386c" Mar 19 19:56:06 crc kubenswrapper[4826]: I0319 19:56:06.705280 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565836-dkt7n" Mar 19 19:56:06 crc kubenswrapper[4826]: I0319 19:56:06.709772 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565830-rdm7n"] Mar 19 19:56:07 crc kubenswrapper[4826]: I0319 19:56:07.995984 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2740d37-b4cf-4399-91b0-d8fe78dbdd99" path="/var/lib/kubelet/pods/b2740d37-b4cf-4399-91b0-d8fe78dbdd99/volumes" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.401149 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.401766 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.401813 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.402955 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.403022 4826 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" gracePeriod=600 Mar 19 19:56:25 crc kubenswrapper[4826]: E0319 19:56:25.534625 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.939024 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" exitCode=0 Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.939125 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3"} Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.939408 4826 scope.go:117] "RemoveContainer" containerID="c8cc57047ec94ef10f76fd2ba288391160ac1023ed586c075e64e8fb5b6a6564" Mar 19 19:56:25 crc kubenswrapper[4826]: I0319 19:56:25.940591 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:56:25 crc kubenswrapper[4826]: E0319 19:56:25.941399 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:56:34 crc kubenswrapper[4826]: I0319 19:56:34.249150 4826 scope.go:117] "RemoveContainer" containerID="cd2a12d4e9beb80871fb4eaaec2cdaf9127ddad465cc52d495ae195c96f238df" Mar 19 19:56:36 crc kubenswrapper[4826]: I0319 19:56:36.977435 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:56:36 crc kubenswrapper[4826]: E0319 19:56:36.978741 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:56:50 crc kubenswrapper[4826]: I0319 19:56:50.977197 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:56:50 crc kubenswrapper[4826]: E0319 19:56:50.978380 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:57:04 crc kubenswrapper[4826]: I0319 19:57:04.976825 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:57:04 crc kubenswrapper[4826]: E0319 19:57:04.977943 4826 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:57:17 crc kubenswrapper[4826]: I0319 19:57:17.976538 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:57:17 crc kubenswrapper[4826]: E0319 19:57:17.977694 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:57:32 crc kubenswrapper[4826]: I0319 19:57:32.977384 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:57:32 crc kubenswrapper[4826]: E0319 19:57:32.978449 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:57:47 crc kubenswrapper[4826]: I0319 19:57:47.977208 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:57:47 crc kubenswrapper[4826]: E0319 
19:57:47.978206 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.160542 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565838-b7gj4"] Mar 19 19:58:00 crc kubenswrapper[4826]: E0319 19:58:00.161644 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f93bd14-0892-4dd0-8a34-3352dcd8e36f" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.161681 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f93bd14-0892-4dd0-8a34-3352dcd8e36f" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.161988 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f93bd14-0892-4dd0-8a34-3352dcd8e36f" containerName="oc" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.163314 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.166259 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.167300 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.175153 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.211848 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgkn\" (UniqueName: \"kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn\") pod \"auto-csr-approver-29565838-b7gj4\" (UID: \"93827aa6-a38e-406a-8d19-82c7399bcc16\") " pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.227011 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-b7gj4"] Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.314426 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgkn\" (UniqueName: \"kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn\") pod \"auto-csr-approver-29565838-b7gj4\" (UID: \"93827aa6-a38e-406a-8d19-82c7399bcc16\") " pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.333764 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgkn\" (UniqueName: \"kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn\") pod \"auto-csr-approver-29565838-b7gj4\" (UID: \"93827aa6-a38e-406a-8d19-82c7399bcc16\") " 
pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:00 crc kubenswrapper[4826]: I0319 19:58:00.496400 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:01 crc kubenswrapper[4826]: I0319 19:58:01.015050 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-b7gj4"] Mar 19 19:58:01 crc kubenswrapper[4826]: I0319 19:58:01.017693 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 19:58:01 crc kubenswrapper[4826]: I0319 19:58:01.318690 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" event={"ID":"93827aa6-a38e-406a-8d19-82c7399bcc16","Type":"ContainerStarted","Data":"d89a0e0ae75613650e406cee37a35dc80c57f7b7d9ab5756dc45add3b374e422"} Mar 19 19:58:01 crc kubenswrapper[4826]: I0319 19:58:01.991005 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:58:01 crc kubenswrapper[4826]: E0319 19:58:01.991923 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:58:03 crc kubenswrapper[4826]: I0319 19:58:03.339889 4826 generic.go:334] "Generic (PLEG): container finished" podID="93827aa6-a38e-406a-8d19-82c7399bcc16" containerID="ba1eca8e661ffc4097f4bdec17414174139c3e1d60ee1ebb14e5e465ecf5b687" exitCode=0 Mar 19 19:58:03 crc kubenswrapper[4826]: I0319 19:58:03.340353 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565838-b7gj4" event={"ID":"93827aa6-a38e-406a-8d19-82c7399bcc16","Type":"ContainerDied","Data":"ba1eca8e661ffc4097f4bdec17414174139c3e1d60ee1ebb14e5e465ecf5b687"} Mar 19 19:58:04 crc kubenswrapper[4826]: I0319 19:58:04.889272 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.066795 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrgkn\" (UniqueName: \"kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn\") pod \"93827aa6-a38e-406a-8d19-82c7399bcc16\" (UID: \"93827aa6-a38e-406a-8d19-82c7399bcc16\") " Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.075356 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn" (OuterVolumeSpecName: "kube-api-access-jrgkn") pod "93827aa6-a38e-406a-8d19-82c7399bcc16" (UID: "93827aa6-a38e-406a-8d19-82c7399bcc16"). InnerVolumeSpecName "kube-api-access-jrgkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.170715 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrgkn\" (UniqueName: \"kubernetes.io/projected/93827aa6-a38e-406a-8d19-82c7399bcc16-kube-api-access-jrgkn\") on node \"crc\" DevicePath \"\"" Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.365422 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" event={"ID":"93827aa6-a38e-406a-8d19-82c7399bcc16","Type":"ContainerDied","Data":"d89a0e0ae75613650e406cee37a35dc80c57f7b7d9ab5756dc45add3b374e422"} Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.365465 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89a0e0ae75613650e406cee37a35dc80c57f7b7d9ab5756dc45add3b374e422" Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.365479 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565838-b7gj4" Mar 19 19:58:05 crc kubenswrapper[4826]: I0319 19:58:05.992526 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-m6mtl"] Mar 19 19:58:06 crc kubenswrapper[4826]: I0319 19:58:06.006473 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565832-m6mtl"] Mar 19 19:58:07 crc kubenswrapper[4826]: I0319 19:58:07.991705 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcff6b33-0cd8-4d47-bbb6-74791f4da46f" path="/var/lib/kubelet/pods/fcff6b33-0cd8-4d47-bbb6-74791f4da46f/volumes" Mar 19 19:58:16 crc kubenswrapper[4826]: I0319 19:58:16.062961 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:58:16 crc kubenswrapper[4826]: E0319 19:58:16.094365 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:58:29 crc kubenswrapper[4826]: I0319 19:58:29.976707 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:58:29 crc kubenswrapper[4826]: E0319 19:58:29.977620 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:58:34 crc kubenswrapper[4826]: I0319 19:58:34.406357 4826 scope.go:117] "RemoveContainer" containerID="a4206b1fc532a8535a9bb2a9bcbdfe3cf1c3b35089f6c55bd28686c117669d5e" Mar 19 19:58:40 crc kubenswrapper[4826]: I0319 19:58:40.976823 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:58:40 crc kubenswrapper[4826]: E0319 19:58:40.977371 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:58:54 crc kubenswrapper[4826]: I0319 19:58:54.976327 4826 scope.go:117] "RemoveContainer" 
containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:58:54 crc kubenswrapper[4826]: E0319 19:58:54.977178 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:59:04 crc kubenswrapper[4826]: I0319 19:59:04.943088 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mw64r"] Mar 19 19:59:04 crc kubenswrapper[4826]: E0319 19:59:04.944351 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93827aa6-a38e-406a-8d19-82c7399bcc16" containerName="oc" Mar 19 19:59:04 crc kubenswrapper[4826]: I0319 19:59:04.944373 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="93827aa6-a38e-406a-8d19-82c7399bcc16" containerName="oc" Mar 19 19:59:04 crc kubenswrapper[4826]: I0319 19:59:04.944692 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="93827aa6-a38e-406a-8d19-82c7399bcc16" containerName="oc" Mar 19 19:59:04 crc kubenswrapper[4826]: I0319 19:59:04.946946 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:04 crc kubenswrapper[4826]: I0319 19:59:04.981361 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw64r"] Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.130259 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-utilities\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.130365 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6s5g\" (UniqueName: \"kubernetes.io/projected/a3821b4d-7122-428f-be08-2c5f72a29b1d-kube-api-access-p6s5g\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.130484 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-catalog-content\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.232665 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-catalog-content\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.232794 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-utilities\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.232845 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6s5g\" (UniqueName: \"kubernetes.io/projected/a3821b4d-7122-428f-be08-2c5f72a29b1d-kube-api-access-p6s5g\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.233247 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-catalog-content\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.233321 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3821b4d-7122-428f-be08-2c5f72a29b1d-utilities\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.266313 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6s5g\" (UniqueName: \"kubernetes.io/projected/a3821b4d-7122-428f-be08-2c5f72a29b1d-kube-api-access-p6s5g\") pod \"certified-operators-mw64r\" (UID: \"a3821b4d-7122-428f-be08-2c5f72a29b1d\") " pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.273085 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:05 crc kubenswrapper[4826]: I0319 19:59:05.823966 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw64r"] Mar 19 19:59:06 crc kubenswrapper[4826]: I0319 19:59:06.135559 4826 generic.go:334] "Generic (PLEG): container finished" podID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerID="a427fd93295c519cd12283dea83bc7436d7ea12374cc3ef7cece193fae150c56" exitCode=0 Mar 19 19:59:06 crc kubenswrapper[4826]: I0319 19:59:06.135619 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw64r" event={"ID":"a3821b4d-7122-428f-be08-2c5f72a29b1d","Type":"ContainerDied","Data":"a427fd93295c519cd12283dea83bc7436d7ea12374cc3ef7cece193fae150c56"} Mar 19 19:59:06 crc kubenswrapper[4826]: I0319 19:59:06.135648 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw64r" event={"ID":"a3821b4d-7122-428f-be08-2c5f72a29b1d","Type":"ContainerStarted","Data":"7707ff92aef447340231d78c6f19d9be1f9d38c7388590309cf67535421ba68d"} Mar 19 19:59:09 crc kubenswrapper[4826]: I0319 19:59:09.977474 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:59:09 crc kubenswrapper[4826]: E0319 19:59:09.978565 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:59:12 crc kubenswrapper[4826]: I0319 19:59:12.197450 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw64r" 
event={"ID":"a3821b4d-7122-428f-be08-2c5f72a29b1d","Type":"ContainerStarted","Data":"08537e5760a8bfaed592dddc252874d4142bc0b1d4f6361601899fc0c4f28557"} Mar 19 19:59:12 crc kubenswrapper[4826]: E0319 19:59:12.506250 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3821b4d_7122_428f_be08_2c5f72a29b1d.slice/crio-08537e5760a8bfaed592dddc252874d4142bc0b1d4f6361601899fc0c4f28557.scope\": RecentStats: unable to find data in memory cache]" Mar 19 19:59:13 crc kubenswrapper[4826]: I0319 19:59:13.215448 4826 generic.go:334] "Generic (PLEG): container finished" podID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerID="08537e5760a8bfaed592dddc252874d4142bc0b1d4f6361601899fc0c4f28557" exitCode=0 Mar 19 19:59:13 crc kubenswrapper[4826]: I0319 19:59:13.215492 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw64r" event={"ID":"a3821b4d-7122-428f-be08-2c5f72a29b1d","Type":"ContainerDied","Data":"08537e5760a8bfaed592dddc252874d4142bc0b1d4f6361601899fc0c4f28557"} Mar 19 19:59:14 crc kubenswrapper[4826]: I0319 19:59:14.226542 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mw64r" event={"ID":"a3821b4d-7122-428f-be08-2c5f72a29b1d","Type":"ContainerStarted","Data":"e6c1a43948ff731869966687b07215663ad11a9bf634c7121864e9ef8748d2df"} Mar 19 19:59:14 crc kubenswrapper[4826]: I0319 19:59:14.252826 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mw64r" podStartSLOduration=2.762587908 podStartE2EDuration="10.252802945s" podCreationTimestamp="2026-03-19 19:59:04 +0000 UTC" firstStartedPulling="2026-03-19 19:59:06.140863027 +0000 UTC m=+3770.894931340" lastFinishedPulling="2026-03-19 19:59:13.631078064 +0000 UTC m=+3778.385146377" observedRunningTime="2026-03-19 19:59:14.242275521 +0000 UTC 
m=+3778.996343854" watchObservedRunningTime="2026-03-19 19:59:14.252802945 +0000 UTC m=+3779.006871268" Mar 19 19:59:15 crc kubenswrapper[4826]: I0319 19:59:15.273853 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:15 crc kubenswrapper[4826]: I0319 19:59:15.274145 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:16 crc kubenswrapper[4826]: I0319 19:59:16.326862 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mw64r" podUID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerName="registry-server" probeResult="failure" output=< Mar 19 19:59:16 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 19:59:16 crc kubenswrapper[4826]: > Mar 19 19:59:20 crc kubenswrapper[4826]: I0319 19:59:20.976262 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:59:20 crc kubenswrapper[4826]: E0319 19:59:20.978940 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:59:25 crc kubenswrapper[4826]: I0319 19:59:25.356840 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:25 crc kubenswrapper[4826]: I0319 19:59:25.464411 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mw64r" Mar 19 19:59:25 crc kubenswrapper[4826]: 
I0319 19:59:25.574686 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mw64r"] Mar 19 19:59:25 crc kubenswrapper[4826]: I0319 19:59:25.657602 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:59:25 crc kubenswrapper[4826]: I0319 19:59:25.657852 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqc8n" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="registry-server" containerID="cri-o://f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8" gracePeriod=2 Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.232770 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.331218 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stgfq\" (UniqueName: \"kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq\") pod \"2b5eabb2-64fe-402a-be4d-7980a750f598\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.331265 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities\") pod \"2b5eabb2-64fe-402a-be4d-7980a750f598\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.331439 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content\") pod \"2b5eabb2-64fe-402a-be4d-7980a750f598\" (UID: \"2b5eabb2-64fe-402a-be4d-7980a750f598\") " Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 
19:59:26.332052 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities" (OuterVolumeSpecName: "utilities") pod "2b5eabb2-64fe-402a-be4d-7980a750f598" (UID: "2b5eabb2-64fe-402a-be4d-7980a750f598"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.337940 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq" (OuterVolumeSpecName: "kube-api-access-stgfq") pod "2b5eabb2-64fe-402a-be4d-7980a750f598" (UID: "2b5eabb2-64fe-402a-be4d-7980a750f598"). InnerVolumeSpecName "kube-api-access-stgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.368561 4826 generic.go:334] "Generic (PLEG): container finished" podID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerID="f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8" exitCode=0 Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.369669 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqc8n" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.370192 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerDied","Data":"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8"} Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.370229 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqc8n" event={"ID":"2b5eabb2-64fe-402a-be4d-7980a750f598","Type":"ContainerDied","Data":"2281d633ee6455bc6913e54e27e59bf25800e6184cc831bce307e1ea5be9cab9"} Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.370246 4826 scope.go:117] "RemoveContainer" containerID="f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.405550 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b5eabb2-64fe-402a-be4d-7980a750f598" (UID: "2b5eabb2-64fe-402a-be4d-7980a750f598"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.420335 4826 scope.go:117] "RemoveContainer" containerID="2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.434467 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stgfq\" (UniqueName: \"kubernetes.io/projected/2b5eabb2-64fe-402a-be4d-7980a750f598-kube-api-access-stgfq\") on node \"crc\" DevicePath \"\"" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.434797 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.434808 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5eabb2-64fe-402a-be4d-7980a750f598-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.455379 4826 scope.go:117] "RemoveContainer" containerID="8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.500544 4826 scope.go:117] "RemoveContainer" containerID="f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8" Mar 19 19:59:26 crc kubenswrapper[4826]: E0319 19:59:26.501100 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8\": container with ID starting with f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8 not found: ID does not exist" containerID="f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.501148 4826 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8"} err="failed to get container status \"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8\": rpc error: code = NotFound desc = could not find container \"f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8\": container with ID starting with f0961250ddbd2ec4583a7d47122a27e7133fc5f61784419a753acc2e055e17b8 not found: ID does not exist" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.501177 4826 scope.go:117] "RemoveContainer" containerID="2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14" Mar 19 19:59:26 crc kubenswrapper[4826]: E0319 19:59:26.501648 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14\": container with ID starting with 2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14 not found: ID does not exist" containerID="2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.501697 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14"} err="failed to get container status \"2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14\": rpc error: code = NotFound desc = could not find container \"2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14\": container with ID starting with 2bec5c48233c0fd9af4ed7dbb91b882bf22652be3800623926f6b22489f9cd14 not found: ID does not exist" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.501720 4826 scope.go:117] "RemoveContainer" containerID="8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a" Mar 19 19:59:26 crc kubenswrapper[4826]: E0319 19:59:26.501979 4826 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a\": container with ID starting with 8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a not found: ID does not exist" containerID="8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.502004 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a"} err="failed to get container status \"8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a\": rpc error: code = NotFound desc = could not find container \"8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a\": container with ID starting with 8c678dba61fc0271b41a72fb141993aa4244bf0d15d77c3c2d7f6815f5b5145a not found: ID does not exist" Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.704115 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:59:26 crc kubenswrapper[4826]: I0319 19:59:26.713563 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqc8n"] Mar 19 19:59:28 crc kubenswrapper[4826]: I0319 19:59:28.005546 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" path="/var/lib/kubelet/pods/2b5eabb2-64fe-402a-be4d-7980a750f598/volumes" Mar 19 19:59:32 crc kubenswrapper[4826]: I0319 19:59:32.977328 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:59:32 crc kubenswrapper[4826]: E0319 19:59:32.979643 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:59:44 crc kubenswrapper[4826]: I0319 19:59:44.976424 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:59:44 crc kubenswrapper[4826]: E0319 19:59:44.977996 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 19:59:56 crc kubenswrapper[4826]: I0319 19:59:56.977208 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 19:59:56 crc kubenswrapper[4826]: E0319 19:59:56.977853 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.174052 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565840-cpxqz"] Mar 19 20:00:00 crc kubenswrapper[4826]: E0319 20:00:00.176091 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="registry-server" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 
20:00:00.176121 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="registry-server" Mar 19 20:00:00 crc kubenswrapper[4826]: E0319 20:00:00.176146 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="extract-content" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.176169 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="extract-content" Mar 19 20:00:00 crc kubenswrapper[4826]: E0319 20:00:00.176189 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="extract-utilities" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.176198 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="extract-utilities" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.176942 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5eabb2-64fe-402a-be4d-7980a750f598" containerName="registry-server" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.178338 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.187982 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.188725 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.196274 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.208293 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-cpxqz"] Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.267719 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5"] Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.269906 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.272401 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.278013 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.283926 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5"] Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.302632 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.302717 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shckg\" (UniqueName: \"kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg\") pod \"auto-csr-approver-29565840-cpxqz\" (UID: \"b4124973-19ff-4e28-b7ce-94423ac25de8\") " pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.302776 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.302806 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwf8\" (UniqueName: \"kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.405267 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.405317 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shckg\" (UniqueName: \"kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg\") pod \"auto-csr-approver-29565840-cpxqz\" (UID: \"b4124973-19ff-4e28-b7ce-94423ac25de8\") " pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.405363 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.405383 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwf8\" (UniqueName: 
\"kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.406571 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.418015 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.424213 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shckg\" (UniqueName: \"kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg\") pod \"auto-csr-approver-29565840-cpxqz\" (UID: \"b4124973-19ff-4e28-b7ce-94423ac25de8\") " pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.425765 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwf8\" (UniqueName: \"kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8\") pod \"collect-profiles-29565840-54ml5\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.510428 4826 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:00 crc kubenswrapper[4826]: I0319 20:00:00.594120 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.059744 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-cpxqz"] Mar 19 20:00:01 crc kubenswrapper[4826]: W0319 20:00:01.077169 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73094e35_9b01_4551_8655_208af32b0af9.slice/crio-be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7 WatchSource:0}: Error finding container be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7: Status 404 returned error can't find the container with id be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7 Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.085957 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5"] Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.861880 4826 generic.go:334] "Generic (PLEG): container finished" podID="73094e35-9b01-4551-8655-208af32b0af9" containerID="9c94b4546c03f2086ea7da6bd1f135450b0a985023f5b66124a9a47c1f7bfa04" exitCode=0 Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.862012 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" event={"ID":"73094e35-9b01-4551-8655-208af32b0af9","Type":"ContainerDied","Data":"9c94b4546c03f2086ea7da6bd1f135450b0a985023f5b66124a9a47c1f7bfa04"} Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.862897 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" event={"ID":"73094e35-9b01-4551-8655-208af32b0af9","Type":"ContainerStarted","Data":"be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7"} Mar 19 20:00:01 crc kubenswrapper[4826]: I0319 20:00:01.868187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" event={"ID":"b4124973-19ff-4e28-b7ce-94423ac25de8","Type":"ContainerStarted","Data":"57c488d450c350a0f30a00fcd237001b688ddc88ae67d78d7fdbb1a27bd6ebd9"} Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.422106 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.478494 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume\") pod \"73094e35-9b01-4551-8655-208af32b0af9\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.478561 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume\") pod \"73094e35-9b01-4551-8655-208af32b0af9\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.478712 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbwf8\" (UniqueName: \"kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8\") pod \"73094e35-9b01-4551-8655-208af32b0af9\" (UID: \"73094e35-9b01-4551-8655-208af32b0af9\") " Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.479573 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume" (OuterVolumeSpecName: "config-volume") pod "73094e35-9b01-4551-8655-208af32b0af9" (UID: "73094e35-9b01-4551-8655-208af32b0af9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.484019 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73094e35-9b01-4551-8655-208af32b0af9" (UID: "73094e35-9b01-4551-8655-208af32b0af9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.484354 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8" (OuterVolumeSpecName: "kube-api-access-lbwf8") pod "73094e35-9b01-4551-8655-208af32b0af9" (UID: "73094e35-9b01-4551-8655-208af32b0af9"). InnerVolumeSpecName "kube-api-access-lbwf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.581805 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73094e35-9b01-4551-8655-208af32b0af9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.581833 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73094e35-9b01-4551-8655-208af32b0af9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.581865 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbwf8\" (UniqueName: \"kubernetes.io/projected/73094e35-9b01-4551-8655-208af32b0af9-kube-api-access-lbwf8\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.889533 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" event={"ID":"73094e35-9b01-4551-8655-208af32b0af9","Type":"ContainerDied","Data":"be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7"} Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.889949 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be784abeee36f2762feb62d474274d623cbecba1778d8ed50cf1915bb5b8f1b7" Mar 19 20:00:03 crc kubenswrapper[4826]: I0319 20:00:03.889604 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565840-54ml5" Mar 19 20:00:04 crc kubenswrapper[4826]: I0319 20:00:04.530225 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6"] Mar 19 20:00:04 crc kubenswrapper[4826]: I0319 20:00:04.542354 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565795-49jd6"] Mar 19 20:00:04 crc kubenswrapper[4826]: I0319 20:00:04.905847 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" event={"ID":"b4124973-19ff-4e28-b7ce-94423ac25de8","Type":"ContainerStarted","Data":"b043cd8f5631b5bd47670db8780d74581e614575119f02b354d622f3f423f5c4"} Mar 19 20:00:04 crc kubenswrapper[4826]: I0319 20:00:04.939746 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" podStartSLOduration=1.6926308319999999 podStartE2EDuration="4.939722751s" podCreationTimestamp="2026-03-19 20:00:00 +0000 UTC" firstStartedPulling="2026-03-19 20:00:01.060811987 +0000 UTC m=+3825.814880310" lastFinishedPulling="2026-03-19 20:00:04.307903876 +0000 UTC m=+3829.061972229" observedRunningTime="2026-03-19 20:00:04.926317157 +0000 UTC m=+3829.680385460" watchObservedRunningTime="2026-03-19 20:00:04.939722751 +0000 UTC m=+3829.693791074" Mar 19 20:00:05 crc kubenswrapper[4826]: I0319 20:00:05.921050 4826 generic.go:334] "Generic (PLEG): container finished" podID="b4124973-19ff-4e28-b7ce-94423ac25de8" containerID="b043cd8f5631b5bd47670db8780d74581e614575119f02b354d622f3f423f5c4" exitCode=0 Mar 19 20:00:05 crc kubenswrapper[4826]: I0319 20:00:05.921194 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" 
event={"ID":"b4124973-19ff-4e28-b7ce-94423ac25de8","Type":"ContainerDied","Data":"b043cd8f5631b5bd47670db8780d74581e614575119f02b354d622f3f423f5c4"} Mar 19 20:00:06 crc kubenswrapper[4826]: I0319 20:00:06.005540 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa2d483-2b01-4910-91fc-28ad369bbb17" path="/var/lib/kubelet/pods/9aa2d483-2b01-4910-91fc-28ad369bbb17/volumes" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.463698 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.591170 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shckg\" (UniqueName: \"kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg\") pod \"b4124973-19ff-4e28-b7ce-94423ac25de8\" (UID: \"b4124973-19ff-4e28-b7ce-94423ac25de8\") " Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.598985 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg" (OuterVolumeSpecName: "kube-api-access-shckg") pod "b4124973-19ff-4e28-b7ce-94423ac25de8" (UID: "b4124973-19ff-4e28-b7ce-94423ac25de8"). InnerVolumeSpecName "kube-api-access-shckg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.694111 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shckg\" (UniqueName: \"kubernetes.io/projected/b4124973-19ff-4e28-b7ce-94423ac25de8-kube-api-access-shckg\") on node \"crc\" DevicePath \"\"" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.953561 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" event={"ID":"b4124973-19ff-4e28-b7ce-94423ac25de8","Type":"ContainerDied","Data":"57c488d450c350a0f30a00fcd237001b688ddc88ae67d78d7fdbb1a27bd6ebd9"} Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.953607 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565840-cpxqz" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.953632 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c488d450c350a0f30a00fcd237001b688ddc88ae67d78d7fdbb1a27bd6ebd9" Mar 19 20:00:07 crc kubenswrapper[4826]: I0319 20:00:07.979134 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:00:07 crc kubenswrapper[4826]: E0319 20:00:07.981240 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:00:08 crc kubenswrapper[4826]: I0319 20:00:08.018944 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-zgtjs"] Mar 19 20:00:08 crc kubenswrapper[4826]: I0319 20:00:08.032566 4826 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565834-zgtjs"] Mar 19 20:00:10 crc kubenswrapper[4826]: I0319 20:00:10.002457 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473327a3-285f-478f-9711-fe1c21c8976c" path="/var/lib/kubelet/pods/473327a3-285f-478f-9711-fe1c21c8976c/volumes" Mar 19 20:00:21 crc kubenswrapper[4826]: I0319 20:00:21.976363 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:00:21 crc kubenswrapper[4826]: E0319 20:00:21.977258 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:00:34 crc kubenswrapper[4826]: I0319 20:00:34.604882 4826 scope.go:117] "RemoveContainer" containerID="fc3e9fe6b07ef836e5f967c917f083ea8f6c8fd4f479a1732ee5ee4089692923" Mar 19 20:00:34 crc kubenswrapper[4826]: I0319 20:00:34.644231 4826 scope.go:117] "RemoveContainer" containerID="eed37c96caee102079a66ef473badf59fd46444c6eb78daa8597dcbaab0156f9" Mar 19 20:00:36 crc kubenswrapper[4826]: I0319 20:00:36.976821 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:00:36 crc kubenswrapper[4826]: E0319 20:00:36.978645 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:00:38 crc kubenswrapper[4826]: I0319 20:00:38.951597 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-zc2nx" Mar 19 20:00:48 crc kubenswrapper[4826]: I0319 20:00:48.976763 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:00:48 crc kubenswrapper[4826]: E0319 20:00:48.977830 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.193903 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565841-mg6mz"] Mar 19 20:01:00 crc kubenswrapper[4826]: E0319 20:01:00.195142 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73094e35-9b01-4551-8655-208af32b0af9" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.195160 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="73094e35-9b01-4551-8655-208af32b0af9" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[4826]: E0319 20:01:00.195178 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4124973-19ff-4e28-b7ce-94423ac25de8" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.195187 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4124973-19ff-4e28-b7ce-94423ac25de8" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.195469 4826 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="73094e35-9b01-4551-8655-208af32b0af9" containerName="collect-profiles" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.195491 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4124973-19ff-4e28-b7ce-94423ac25de8" containerName="oc" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.196573 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.208572 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565841-mg6mz"] Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.225347 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.225404 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.225479 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.225511 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqlw7\" (UniqueName: \"kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.332937 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.333192 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.333278 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.333310 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqlw7\" (UniqueName: \"kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.341009 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.341136 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.346636 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.350913 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqlw7\" (UniqueName: \"kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7\") pod \"keystone-cron-29565841-mg6mz\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:00 crc kubenswrapper[4826]: I0319 20:01:00.533257 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:01 crc kubenswrapper[4826]: I0319 20:01:01.325319 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565841-mg6mz"] Mar 19 20:01:01 crc kubenswrapper[4826]: I0319 20:01:01.603669 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-mg6mz" event={"ID":"29dfabdd-c376-4871-9fa7-68f2feef0e4a","Type":"ContainerStarted","Data":"912bd836e3eedfaa5d5a04487061ee7fb24e7626745472eb030eff071c801224"} Mar 19 20:01:01 crc kubenswrapper[4826]: I0319 20:01:01.987348 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:01:01 crc kubenswrapper[4826]: E0319 20:01:01.989161 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:01:02 crc kubenswrapper[4826]: I0319 20:01:02.618476 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-mg6mz" event={"ID":"29dfabdd-c376-4871-9fa7-68f2feef0e4a","Type":"ContainerStarted","Data":"ae55f20f78d80c2c0060bc0b8c21db2ba8a5565ac6d38d2bab09efd66c1c7d84"} Mar 19 20:01:02 crc kubenswrapper[4826]: I0319 20:01:02.643995 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565841-mg6mz" podStartSLOduration=2.643976883 podStartE2EDuration="2.643976883s" podCreationTimestamp="2026-03-19 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:01:02.637366603 +0000 
UTC m=+3887.391434916" watchObservedRunningTime="2026-03-19 20:01:02.643976883 +0000 UTC m=+3887.398045196" Mar 19 20:01:05 crc kubenswrapper[4826]: I0319 20:01:05.655552 4826 generic.go:334] "Generic (PLEG): container finished" podID="29dfabdd-c376-4871-9fa7-68f2feef0e4a" containerID="ae55f20f78d80c2c0060bc0b8c21db2ba8a5565ac6d38d2bab09efd66c1c7d84" exitCode=0 Mar 19 20:01:05 crc kubenswrapper[4826]: I0319 20:01:05.655673 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-mg6mz" event={"ID":"29dfabdd-c376-4871-9fa7-68f2feef0e4a","Type":"ContainerDied","Data":"ae55f20f78d80c2c0060bc0b8c21db2ba8a5565ac6d38d2bab09efd66c1c7d84"} Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.486024 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.654932 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys\") pod \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.655094 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data\") pod \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.655162 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle\") pod \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.655948 4826 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqlw7\" (UniqueName: \"kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7\") pod \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\" (UID: \"29dfabdd-c376-4871-9fa7-68f2feef0e4a\") " Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.660477 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "29dfabdd-c376-4871-9fa7-68f2feef0e4a" (UID: "29dfabdd-c376-4871-9fa7-68f2feef0e4a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.660961 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7" (OuterVolumeSpecName: "kube-api-access-cqlw7") pod "29dfabdd-c376-4871-9fa7-68f2feef0e4a" (UID: "29dfabdd-c376-4871-9fa7-68f2feef0e4a"). InnerVolumeSpecName "kube-api-access-cqlw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.679540 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565841-mg6mz" event={"ID":"29dfabdd-c376-4871-9fa7-68f2feef0e4a","Type":"ContainerDied","Data":"912bd836e3eedfaa5d5a04487061ee7fb24e7626745472eb030eff071c801224"} Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.679593 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912bd836e3eedfaa5d5a04487061ee7fb24e7626745472eb030eff071c801224" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.679700 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565841-mg6mz" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.719757 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29dfabdd-c376-4871-9fa7-68f2feef0e4a" (UID: "29dfabdd-c376-4871-9fa7-68f2feef0e4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.733532 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data" (OuterVolumeSpecName: "config-data") pod "29dfabdd-c376-4871-9fa7-68f2feef0e4a" (UID: "29dfabdd-c376-4871-9fa7-68f2feef0e4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.759523 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqlw7\" (UniqueName: \"kubernetes.io/projected/29dfabdd-c376-4871-9fa7-68f2feef0e4a-kube-api-access-cqlw7\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.759560 4826 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.759569 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:07 crc kubenswrapper[4826]: I0319 20:01:07.759580 4826 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dfabdd-c376-4871-9fa7-68f2feef0e4a-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 19 20:01:14 crc kubenswrapper[4826]: I0319 20:01:14.977560 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:01:14 crc kubenswrapper[4826]: E0319 20:01:14.978349 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.143351 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:28 crc kubenswrapper[4826]: E0319 20:01:28.144780 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dfabdd-c376-4871-9fa7-68f2feef0e4a" containerName="keystone-cron" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.144805 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dfabdd-c376-4871-9fa7-68f2feef0e4a" containerName="keystone-cron" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.145312 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dfabdd-c376-4871-9fa7-68f2feef0e4a" containerName="keystone-cron" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.149137 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.157240 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.304819 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.304985 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlmp\" (UniqueName: \"kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.305036 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.406993 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.407211 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.407386 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlmp\" (UniqueName: \"kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.408352 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.408775 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.427303 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlmp\" (UniqueName: \"kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp\") pod \"redhat-marketplace-rvnxh\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.488566 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:28 crc kubenswrapper[4826]: I0319 20:01:28.977107 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:01:29 crc kubenswrapper[4826]: I0319 20:01:29.063837 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:29 crc kubenswrapper[4826]: W0319 20:01:29.071255 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d4f8b9_16e7_45d6_ae37_f270e1b5b22e.slice/crio-902928db90fe8d9e0008b31bd2a66a600fe7465137b53e53b71b2013fd3fedfb WatchSource:0}: Error finding container 902928db90fe8d9e0008b31bd2a66a600fe7465137b53e53b71b2013fd3fedfb: Status 404 returned error can't find the container with id 902928db90fe8d9e0008b31bd2a66a600fe7465137b53e53b71b2013fd3fedfb Mar 19 20:01:29 crc kubenswrapper[4826]: I0319 20:01:29.954754 4826 generic.go:334] "Generic (PLEG): container finished" podID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerID="2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f" exitCode=0 Mar 19 20:01:29 crc kubenswrapper[4826]: I0319 20:01:29.954796 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerDied","Data":"2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f"} Mar 19 20:01:29 crc kubenswrapper[4826]: I0319 20:01:29.955051 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerStarted","Data":"902928db90fe8d9e0008b31bd2a66a600fe7465137b53e53b71b2013fd3fedfb"} Mar 19 20:01:29 crc kubenswrapper[4826]: I0319 20:01:29.963761 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa"} Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.727158 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.730480 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.741349 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.819302 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86gl\" (UniqueName: \"kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.819353 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.819407 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " 
pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.921990 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86gl\" (UniqueName: \"kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.922043 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.922095 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.922588 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.922623 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " 
pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.944308 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86gl\" (UniqueName: \"kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl\") pod \"community-operators-mwnqk\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:31 crc kubenswrapper[4826]: I0319 20:01:31.995230 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerStarted","Data":"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0"} Mar 19 20:01:32 crc kubenswrapper[4826]: I0319 20:01:32.050529 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:32 crc kubenswrapper[4826]: I0319 20:01:32.685613 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 20:01:33 crc kubenswrapper[4826]: I0319 20:01:33.012843 4826 generic.go:334] "Generic (PLEG): container finished" podID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerID="57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39" exitCode=0 Mar 19 20:01:33 crc kubenswrapper[4826]: I0319 20:01:33.013187 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerDied","Data":"57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39"} Mar 19 20:01:33 crc kubenswrapper[4826]: I0319 20:01:33.013213 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" 
event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerStarted","Data":"f453726fe1007ffe74edb5cab90544b8b3f40bf8ea272af896bdd8a284c4fa5d"} Mar 19 20:01:33 crc kubenswrapper[4826]: I0319 20:01:33.017084 4826 generic.go:334] "Generic (PLEG): container finished" podID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerID="7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0" exitCode=0 Mar 19 20:01:33 crc kubenswrapper[4826]: I0319 20:01:33.017120 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerDied","Data":"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0"} Mar 19 20:01:34 crc kubenswrapper[4826]: I0319 20:01:34.033354 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerStarted","Data":"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422"} Mar 19 20:01:34 crc kubenswrapper[4826]: I0319 20:01:34.036531 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerStarted","Data":"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea"} Mar 19 20:01:34 crc kubenswrapper[4826]: I0319 20:01:34.055955 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvnxh" podStartSLOduration=2.572622153 podStartE2EDuration="6.055930563s" podCreationTimestamp="2026-03-19 20:01:28 +0000 UTC" firstStartedPulling="2026-03-19 20:01:29.958135068 +0000 UTC m=+3914.712203401" lastFinishedPulling="2026-03-19 20:01:33.441443458 +0000 UTC m=+3918.195511811" observedRunningTime="2026-03-19 20:01:34.050731277 +0000 UTC m=+3918.804799640" watchObservedRunningTime="2026-03-19 20:01:34.055930563 +0000 UTC 
m=+3918.809998876" Mar 19 20:01:36 crc kubenswrapper[4826]: I0319 20:01:36.060011 4826 generic.go:334] "Generic (PLEG): container finished" podID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerID="ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea" exitCode=0 Mar 19 20:01:36 crc kubenswrapper[4826]: I0319 20:01:36.060083 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerDied","Data":"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea"} Mar 19 20:01:37 crc kubenswrapper[4826]: I0319 20:01:37.073848 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerStarted","Data":"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16"} Mar 19 20:01:37 crc kubenswrapper[4826]: I0319 20:01:37.097422 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mwnqk" podStartSLOduration=2.592627383 podStartE2EDuration="6.097403871s" podCreationTimestamp="2026-03-19 20:01:31 +0000 UTC" firstStartedPulling="2026-03-19 20:01:33.014433055 +0000 UTC m=+3917.768501368" lastFinishedPulling="2026-03-19 20:01:36.519209533 +0000 UTC m=+3921.273277856" observedRunningTime="2026-03-19 20:01:37.091809286 +0000 UTC m=+3921.845877609" watchObservedRunningTime="2026-03-19 20:01:37.097403871 +0000 UTC m=+3921.851472184" Mar 19 20:01:38 crc kubenswrapper[4826]: I0319 20:01:38.489797 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:38 crc kubenswrapper[4826]: I0319 20:01:38.490127 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:38 crc kubenswrapper[4826]: I0319 20:01:38.605513 4826 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:39 crc kubenswrapper[4826]: I0319 20:01:39.146145 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:39 crc kubenswrapper[4826]: I0319 20:01:39.704523 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.128544 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rvnxh" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="registry-server" containerID="cri-o://cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422" gracePeriod=2 Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.702489 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.798867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhlmp\" (UniqueName: \"kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp\") pod \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.798963 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities\") pod \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.798992 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content\") pod \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\" (UID: \"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e\") " Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.799682 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities" (OuterVolumeSpecName: "utilities") pod "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" (UID: "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.806452 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp" (OuterVolumeSpecName: "kube-api-access-lhlmp") pod "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" (UID: "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e"). InnerVolumeSpecName "kube-api-access-lhlmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.821812 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" (UID: "81d4f8b9-16e7-45d6-ae37-f270e1b5b22e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.903193 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.903239 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:41 crc kubenswrapper[4826]: I0319 20:01:41.903251 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhlmp\" (UniqueName: \"kubernetes.io/projected/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e-kube-api-access-lhlmp\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.051127 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.051515 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.101797 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.149680 4826 generic.go:334] "Generic (PLEG): container finished" podID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerID="cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422" exitCode=0 Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.149794 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvnxh" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.149794 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerDied","Data":"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422"} Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.149860 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvnxh" event={"ID":"81d4f8b9-16e7-45d6-ae37-f270e1b5b22e","Type":"ContainerDied","Data":"902928db90fe8d9e0008b31bd2a66a600fe7465137b53e53b71b2013fd3fedfb"} Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.149890 4826 scope.go:117] "RemoveContainer" containerID="cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.183000 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.201526 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvnxh"] Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.206724 4826 scope.go:117] "RemoveContainer" containerID="7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.211095 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.235607 4826 scope.go:117] "RemoveContainer" containerID="2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.297642 4826 scope.go:117] "RemoveContainer" containerID="cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422" Mar 19 20:01:42 crc 
kubenswrapper[4826]: E0319 20:01:42.298300 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422\": container with ID starting with cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422 not found: ID does not exist" containerID="cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.298336 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422"} err="failed to get container status \"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422\": rpc error: code = NotFound desc = could not find container \"cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422\": container with ID starting with cb25dfb2dcc376bf7ba6faecd094a5c0eaa0090756cb9437d93e2b90e5bb5422 not found: ID does not exist" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.298363 4826 scope.go:117] "RemoveContainer" containerID="7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0" Mar 19 20:01:42 crc kubenswrapper[4826]: E0319 20:01:42.305046 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0\": container with ID starting with 7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0 not found: ID does not exist" containerID="7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.305086 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0"} err="failed to get container status 
\"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0\": rpc error: code = NotFound desc = could not find container \"7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0\": container with ID starting with 7fbb31971994ac237597f1650b9b23b74510eed485f1f6d47735357c848f74e0 not found: ID does not exist" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.305113 4826 scope.go:117] "RemoveContainer" containerID="2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f" Mar 19 20:01:42 crc kubenswrapper[4826]: E0319 20:01:42.308221 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f\": container with ID starting with 2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f not found: ID does not exist" containerID="2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f" Mar 19 20:01:42 crc kubenswrapper[4826]: I0319 20:01:42.308264 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f"} err="failed to get container status \"2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f\": rpc error: code = NotFound desc = could not find container \"2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f\": container with ID starting with 2de06391657cbe62da50b08b48c1c45e6832b063c398752a98b5d9697624b05f not found: ID does not exist" Mar 19 20:01:43 crc kubenswrapper[4826]: I0319 20:01:43.994679 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" path="/var/lib/kubelet/pods/81d4f8b9-16e7-45d6-ae37-f270e1b5b22e/volumes" Mar 19 20:01:44 crc kubenswrapper[4826]: I0319 20:01:44.514594 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 
20:01:44 crc kubenswrapper[4826]: I0319 20:01:44.515194 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mwnqk" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="registry-server" containerID="cri-o://dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16" gracePeriod=2 Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.081206 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.184833 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z86gl\" (UniqueName: \"kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl\") pod \"ada29661-69db-494b-81ac-f4f3b95e60a9\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185212 4826 generic.go:334] "Generic (PLEG): container finished" podID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerID="dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16" exitCode=0 Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185259 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerDied","Data":"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16"} Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185262 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mwnqk" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185299 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwnqk" event={"ID":"ada29661-69db-494b-81ac-f4f3b95e60a9","Type":"ContainerDied","Data":"f453726fe1007ffe74edb5cab90544b8b3f40bf8ea272af896bdd8a284c4fa5d"} Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185324 4826 scope.go:117] "RemoveContainer" containerID="dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185451 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities\") pod \"ada29661-69db-494b-81ac-f4f3b95e60a9\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.185774 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content\") pod \"ada29661-69db-494b-81ac-f4f3b95e60a9\" (UID: \"ada29661-69db-494b-81ac-f4f3b95e60a9\") " Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.187068 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities" (OuterVolumeSpecName: "utilities") pod "ada29661-69db-494b-81ac-f4f3b95e60a9" (UID: "ada29661-69db-494b-81ac-f4f3b95e60a9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.187344 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.194013 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl" (OuterVolumeSpecName: "kube-api-access-z86gl") pod "ada29661-69db-494b-81ac-f4f3b95e60a9" (UID: "ada29661-69db-494b-81ac-f4f3b95e60a9"). InnerVolumeSpecName "kube-api-access-z86gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.237953 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ada29661-69db-494b-81ac-f4f3b95e60a9" (UID: "ada29661-69db-494b-81ac-f4f3b95e60a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.258598 4826 scope.go:117] "RemoveContainer" containerID="ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.280766 4826 scope.go:117] "RemoveContainer" containerID="57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.289810 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ada29661-69db-494b-81ac-f4f3b95e60a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.289848 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z86gl\" (UniqueName: \"kubernetes.io/projected/ada29661-69db-494b-81ac-f4f3b95e60a9-kube-api-access-z86gl\") on node \"crc\" DevicePath \"\"" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.344154 4826 scope.go:117] "RemoveContainer" containerID="dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16" Mar 19 20:01:45 crc kubenswrapper[4826]: E0319 20:01:45.344495 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16\": container with ID starting with dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16 not found: ID does not exist" containerID="dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.344530 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16"} err="failed to get container status \"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16\": rpc error: code = NotFound desc = could not find 
container \"dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16\": container with ID starting with dc6c780750284161f83aeac3e0bc7c884b19446ae061541352bd2f07290fec16 not found: ID does not exist" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.344558 4826 scope.go:117] "RemoveContainer" containerID="ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea" Mar 19 20:01:45 crc kubenswrapper[4826]: E0319 20:01:45.344834 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea\": container with ID starting with ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea not found: ID does not exist" containerID="ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.344862 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea"} err="failed to get container status \"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea\": rpc error: code = NotFound desc = could not find container \"ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea\": container with ID starting with ea88a6ffa2ecb59d564d121d27de113f34f70f7f2b755107ee3b499cb88cb7ea not found: ID does not exist" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.344880 4826 scope.go:117] "RemoveContainer" containerID="57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39" Mar 19 20:01:45 crc kubenswrapper[4826]: E0319 20:01:45.345173 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39\": container with ID starting with 57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39 not found: ID does 
not exist" containerID="57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.345198 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39"} err="failed to get container status \"57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39\": rpc error: code = NotFound desc = could not find container \"57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39\": container with ID starting with 57302cca73651de6bc098948a6fed2ed6ecf5ef60e3e9d929643812fcb3a7d39 not found: ID does not exist" Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.519528 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 20:01:45 crc kubenswrapper[4826]: I0319 20:01:45.529462 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mwnqk"] Mar 19 20:01:46 crc kubenswrapper[4826]: I0319 20:01:46.003742 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" path="/var/lib/kubelet/pods/ada29661-69db-494b-81ac-f4f3b95e60a9/volumes" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.168888 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565842-4dzdg"] Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170229 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170250 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170281 4826 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170294 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170346 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170359 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170391 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170415 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="extract-content" Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170455 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170467 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: E0319 20:02:00.170501 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170515 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="extract-utilities" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170928 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ada29661-69db-494b-81ac-f4f3b95e60a9" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.170971 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d4f8b9-16e7-45d6-ae37-f270e1b5b22e" containerName="registry-server" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.172164 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.177119 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.177467 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.179510 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.181034 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-4dzdg"] Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.190440 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8t2\" (UniqueName: \"kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2\") pod \"auto-csr-approver-29565842-4dzdg\" (UID: \"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021\") " pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.293044 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8t2\" (UniqueName: \"kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2\") pod \"auto-csr-approver-29565842-4dzdg\" (UID: \"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021\") " 
pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.313224 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8t2\" (UniqueName: \"kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2\") pod \"auto-csr-approver-29565842-4dzdg\" (UID: \"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021\") " pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:00 crc kubenswrapper[4826]: I0319 20:02:00.509437 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:01 crc kubenswrapper[4826]: I0319 20:02:01.036377 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-4dzdg"] Mar 19 20:02:01 crc kubenswrapper[4826]: I0319 20:02:01.425320 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" event={"ID":"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021","Type":"ContainerStarted","Data":"5d3edb771a3d94dd482e8ea429b554e5f9c6f7a373ee61bf9607c60d5d2397c4"} Mar 19 20:02:03 crc kubenswrapper[4826]: I0319 20:02:03.452304 4826 generic.go:334] "Generic (PLEG): container finished" podID="9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" containerID="8701c874e8a3369005f16919b1594de37677752f3f7df185f4e63dfb70b33b59" exitCode=0 Mar 19 20:02:03 crc kubenswrapper[4826]: I0319 20:02:03.452600 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" event={"ID":"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021","Type":"ContainerDied","Data":"8701c874e8a3369005f16919b1594de37677752f3f7df185f4e63dfb70b33b59"} Mar 19 20:02:04 crc kubenswrapper[4826]: I0319 20:02:04.979245 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.134678 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh8t2\" (UniqueName: \"kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2\") pod \"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021\" (UID: \"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021\") " Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.141480 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2" (OuterVolumeSpecName: "kube-api-access-fh8t2") pod "9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" (UID: "9d2c54a8-551d-4e31-a2cd-c42f4b7c3021"). InnerVolumeSpecName "kube-api-access-fh8t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.237439 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh8t2\" (UniqueName: \"kubernetes.io/projected/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021-kube-api-access-fh8t2\") on node \"crc\" DevicePath \"\"" Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.493766 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" event={"ID":"9d2c54a8-551d-4e31-a2cd-c42f4b7c3021","Type":"ContainerDied","Data":"5d3edb771a3d94dd482e8ea429b554e5f9c6f7a373ee61bf9607c60d5d2397c4"} Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.493848 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3edb771a3d94dd482e8ea429b554e5f9c6f7a373ee61bf9607c60d5d2397c4" Mar 19 20:02:05 crc kubenswrapper[4826]: I0319 20:02:05.493922 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565842-4dzdg" Mar 19 20:02:06 crc kubenswrapper[4826]: I0319 20:02:06.073595 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-dkt7n"] Mar 19 20:02:06 crc kubenswrapper[4826]: I0319 20:02:06.086525 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565836-dkt7n"] Mar 19 20:02:08 crc kubenswrapper[4826]: I0319 20:02:08.000434 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f93bd14-0892-4dd0-8a34-3352dcd8e36f" path="/var/lib/kubelet/pods/0f93bd14-0892-4dd0-8a34-3352dcd8e36f/volumes" Mar 19 20:02:34 crc kubenswrapper[4826]: I0319 20:02:34.860786 4826 scope.go:117] "RemoveContainer" containerID="35355a22aac8ad5229841f60e5c5ee1082cf0ca4d2b9db36d2899c0f43e497d7" Mar 19 20:03:55 crc kubenswrapper[4826]: I0319 20:03:55.400795 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:03:55 crc kubenswrapper[4826]: I0319 20:03:55.401369 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.181962 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565844-42ht6"] Mar 19 20:04:00 crc kubenswrapper[4826]: E0319 20:04:00.183018 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" containerName="oc" Mar 19 20:04:00 crc 
kubenswrapper[4826]: I0319 20:04:00.183032 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" containerName="oc" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.183324 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" containerName="oc" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.184315 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.189210 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.189427 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.189484 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.199835 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-42ht6"] Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.229190 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b8l4\" (UniqueName: \"kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4\") pod \"auto-csr-approver-29565844-42ht6\" (UID: \"ab1e329f-1f51-4363-92bf-b7922d6a4df5\") " pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.331486 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b8l4\" (UniqueName: \"kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4\") pod \"auto-csr-approver-29565844-42ht6\" 
(UID: \"ab1e329f-1f51-4363-92bf-b7922d6a4df5\") " pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.353719 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b8l4\" (UniqueName: \"kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4\") pod \"auto-csr-approver-29565844-42ht6\" (UID: \"ab1e329f-1f51-4363-92bf-b7922d6a4df5\") " pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:00 crc kubenswrapper[4826]: I0319 20:04:00.508823 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:01 crc kubenswrapper[4826]: I0319 20:04:01.019566 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-42ht6"] Mar 19 20:04:01 crc kubenswrapper[4826]: I0319 20:04:01.030870 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:04:01 crc kubenswrapper[4826]: I0319 20:04:01.974897 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-42ht6" event={"ID":"ab1e329f-1f51-4363-92bf-b7922d6a4df5","Type":"ContainerStarted","Data":"2b6f9d5f2eb55ef4de6c0cc7ad0364b38d5963deb2a8a067b29129cb7cb071c5"} Mar 19 20:04:03 crc kubenswrapper[4826]: I0319 20:04:03.080042 4826 generic.go:334] "Generic (PLEG): container finished" podID="ab1e329f-1f51-4363-92bf-b7922d6a4df5" containerID="2af134f899da8a5d2dea2aa1c80bf6eeaec2bae25e66a15e34dda4868f03576f" exitCode=0 Mar 19 20:04:03 crc kubenswrapper[4826]: I0319 20:04:03.080600 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-42ht6" event={"ID":"ab1e329f-1f51-4363-92bf-b7922d6a4df5","Type":"ContainerDied","Data":"2af134f899da8a5d2dea2aa1c80bf6eeaec2bae25e66a15e34dda4868f03576f"} Mar 19 20:04:04 crc kubenswrapper[4826]: I0319 
20:04:04.577202 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:04 crc kubenswrapper[4826]: I0319 20:04:04.650373 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b8l4\" (UniqueName: \"kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4\") pod \"ab1e329f-1f51-4363-92bf-b7922d6a4df5\" (UID: \"ab1e329f-1f51-4363-92bf-b7922d6a4df5\") " Mar 19 20:04:04 crc kubenswrapper[4826]: I0319 20:04:04.656719 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4" (OuterVolumeSpecName: "kube-api-access-5b8l4") pod "ab1e329f-1f51-4363-92bf-b7922d6a4df5" (UID: "ab1e329f-1f51-4363-92bf-b7922d6a4df5"). InnerVolumeSpecName "kube-api-access-5b8l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:04:04 crc kubenswrapper[4826]: I0319 20:04:04.753620 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b8l4\" (UniqueName: \"kubernetes.io/projected/ab1e329f-1f51-4363-92bf-b7922d6a4df5-kube-api-access-5b8l4\") on node \"crc\" DevicePath \"\"" Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.103805 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565844-42ht6" event={"ID":"ab1e329f-1f51-4363-92bf-b7922d6a4df5","Type":"ContainerDied","Data":"2b6f9d5f2eb55ef4de6c0cc7ad0364b38d5963deb2a8a067b29129cb7cb071c5"} Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.103848 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6f9d5f2eb55ef4de6c0cc7ad0364b38d5963deb2a8a067b29129cb7cb071c5" Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.103860 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565844-42ht6" Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.666509 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-b7gj4"] Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.686239 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565838-b7gj4"] Mar 19 20:04:05 crc kubenswrapper[4826]: I0319 20:04:05.990426 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93827aa6-a38e-406a-8d19-82c7399bcc16" path="/var/lib/kubelet/pods/93827aa6-a38e-406a-8d19-82c7399bcc16/volumes" Mar 19 20:04:25 crc kubenswrapper[4826]: I0319 20:04:25.400475 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:04:25 crc kubenswrapper[4826]: I0319 20:04:25.401194 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:04:35 crc kubenswrapper[4826]: I0319 20:04:35.035588 4826 scope.go:117] "RemoveContainer" containerID="ba1eca8e661ffc4097f4bdec17414174139c3e1d60ee1ebb14e5e465ecf5b687" Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.400398 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:04:55 crc kubenswrapper[4826]: 
I0319 20:04:55.401159 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.401218 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.402537 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.402630 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa" gracePeriod=600 Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.845639 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa" exitCode=0 Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.846545 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa"} Mar 19 20:04:55 crc 
kubenswrapper[4826]: I0319 20:04:55.846753 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"} Mar 19 20:04:55 crc kubenswrapper[4826]: I0319 20:04:55.846976 4826 scope.go:117] "RemoveContainer" containerID="19026675d22c90c0227b4d60a456b6c13458f69e7a00a61e59a04ce0e98ff9e3" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.480532 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:05:38 crc kubenswrapper[4826]: E0319 20:05:38.481685 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1e329f-1f51-4363-92bf-b7922d6a4df5" containerName="oc" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.481704 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1e329f-1f51-4363-92bf-b7922d6a4df5" containerName="oc" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.482025 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1e329f-1f51-4363-92bf-b7922d6a4df5" containerName="oc" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.484101 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.545712 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.573172 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2q57\" (UniqueName: \"kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.573224 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.573290 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.675785 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2q57\" (UniqueName: \"kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.675834 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.675901 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.676893 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.676927 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.715686 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2q57\" (UniqueName: \"kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57\") pod \"redhat-operators-2dlwd\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:38 crc kubenswrapper[4826]: I0319 20:05:38.821607 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:39 crc kubenswrapper[4826]: I0319 20:05:39.312343 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:05:39 crc kubenswrapper[4826]: I0319 20:05:39.560576 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerStarted","Data":"ee4aa50ecb03774f585362bcf3e0b586a3b03979c12abbb7de98ddcba20ebb23"} Mar 19 20:05:39 crc kubenswrapper[4826]: I0319 20:05:39.560620 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerStarted","Data":"c89786d594d532ed112701e059c54151d0bf6216de7b7657575c5d5ebbc19ba1"} Mar 19 20:05:40 crc kubenswrapper[4826]: I0319 20:05:40.571381 4826 generic.go:334] "Generic (PLEG): container finished" podID="594a0e00-3ed1-487e-ac55-db5940afa789" containerID="ee4aa50ecb03774f585362bcf3e0b586a3b03979c12abbb7de98ddcba20ebb23" exitCode=0 Mar 19 20:05:40 crc kubenswrapper[4826]: I0319 20:05:40.571461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerDied","Data":"ee4aa50ecb03774f585362bcf3e0b586a3b03979c12abbb7de98ddcba20ebb23"} Mar 19 20:05:41 crc kubenswrapper[4826]: I0319 20:05:41.588024 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerStarted","Data":"3cd4c0afe04b27b45ef7b705fc738f551d2440b86a0330a9551a5e2d7c5df677"} Mar 19 20:05:46 crc kubenswrapper[4826]: I0319 20:05:46.684763 4826 generic.go:334] "Generic (PLEG): container finished" podID="594a0e00-3ed1-487e-ac55-db5940afa789" 
containerID="3cd4c0afe04b27b45ef7b705fc738f551d2440b86a0330a9551a5e2d7c5df677" exitCode=0 Mar 19 20:05:46 crc kubenswrapper[4826]: I0319 20:05:46.684969 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerDied","Data":"3cd4c0afe04b27b45ef7b705fc738f551d2440b86a0330a9551a5e2d7c5df677"} Mar 19 20:05:47 crc kubenswrapper[4826]: I0319 20:05:47.703627 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerStarted","Data":"ea8083dd987309834e310327047505670714e9eceec3e1bf19bae814150040f7"} Mar 19 20:05:47 crc kubenswrapper[4826]: I0319 20:05:47.730528 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2dlwd" podStartSLOduration=2.205181779 podStartE2EDuration="9.730416503s" podCreationTimestamp="2026-03-19 20:05:38 +0000 UTC" firstStartedPulling="2026-03-19 20:05:39.562351772 +0000 UTC m=+4164.316420075" lastFinishedPulling="2026-03-19 20:05:47.087586486 +0000 UTC m=+4171.841654799" observedRunningTime="2026-03-19 20:05:47.720602668 +0000 UTC m=+4172.474671001" watchObservedRunningTime="2026-03-19 20:05:47.730416503 +0000 UTC m=+4172.484484836" Mar 19 20:05:48 crc kubenswrapper[4826]: I0319 20:05:48.822462 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:48 crc kubenswrapper[4826]: I0319 20:05:48.822851 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:05:49 crc kubenswrapper[4826]: I0319 20:05:49.903287 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2dlwd" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="registry-server" 
probeResult="failure" output=< Mar 19 20:05:49 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:05:49 crc kubenswrapper[4826]: > Mar 19 20:05:59 crc kubenswrapper[4826]: I0319 20:05:59.885788 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2dlwd" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="registry-server" probeResult="failure" output=< Mar 19 20:05:59 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:05:59 crc kubenswrapper[4826]: > Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.173322 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565846-49tqd"] Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.175363 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.177373 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.177544 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.178710 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.195892 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-49tqd"] Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.238385 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnq4\" (UniqueName: \"kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4\") pod \"auto-csr-approver-29565846-49tqd\" (UID: 
\"f3a29d0b-189d-44fa-a64e-d7988ea4031c\") " pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.340166 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnq4\" (UniqueName: \"kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4\") pod \"auto-csr-approver-29565846-49tqd\" (UID: \"f3a29d0b-189d-44fa-a64e-d7988ea4031c\") " pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.359621 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnq4\" (UniqueName: \"kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4\") pod \"auto-csr-approver-29565846-49tqd\" (UID: \"f3a29d0b-189d-44fa-a64e-d7988ea4031c\") " pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:00 crc kubenswrapper[4826]: I0319 20:06:00.491894 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:01 crc kubenswrapper[4826]: I0319 20:06:01.106713 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-49tqd"] Mar 19 20:06:01 crc kubenswrapper[4826]: I0319 20:06:01.875387 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-49tqd" event={"ID":"f3a29d0b-189d-44fa-a64e-d7988ea4031c","Type":"ContainerStarted","Data":"91d03876ee97254f150f3b3ac12ce738e92923322b300b6d81306b9884ab85f0"} Mar 19 20:06:02 crc kubenswrapper[4826]: I0319 20:06:02.912806 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-49tqd" event={"ID":"f3a29d0b-189d-44fa-a64e-d7988ea4031c","Type":"ContainerStarted","Data":"da8af6207021e2cac817da2b8fc34ec1a5956ab409e23a13207ea94fb036bb6e"} Mar 19 20:06:02 crc kubenswrapper[4826]: I0319 20:06:02.932243 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565846-49tqd" podStartSLOduration=1.648271829 podStartE2EDuration="2.932226861s" podCreationTimestamp="2026-03-19 20:06:00 +0000 UTC" firstStartedPulling="2026-03-19 20:06:01.108915525 +0000 UTC m=+4185.862983848" lastFinishedPulling="2026-03-19 20:06:02.392870567 +0000 UTC m=+4187.146938880" observedRunningTime="2026-03-19 20:06:02.925208746 +0000 UTC m=+4187.679277059" watchObservedRunningTime="2026-03-19 20:06:02.932226861 +0000 UTC m=+4187.686295174" Mar 19 20:06:03 crc kubenswrapper[4826]: I0319 20:06:03.925976 4826 generic.go:334] "Generic (PLEG): container finished" podID="f3a29d0b-189d-44fa-a64e-d7988ea4031c" containerID="da8af6207021e2cac817da2b8fc34ec1a5956ab409e23a13207ea94fb036bb6e" exitCode=0 Mar 19 20:06:03 crc kubenswrapper[4826]: I0319 20:06:03.926267 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-49tqd" 
event={"ID":"f3a29d0b-189d-44fa-a64e-d7988ea4031c","Type":"ContainerDied","Data":"da8af6207021e2cac817da2b8fc34ec1a5956ab409e23a13207ea94fb036bb6e"} Mar 19 20:06:05 crc kubenswrapper[4826]: I0319 20:06:05.549796 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:05 crc kubenswrapper[4826]: I0319 20:06:05.588344 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xnq4\" (UniqueName: \"kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4\") pod \"f3a29d0b-189d-44fa-a64e-d7988ea4031c\" (UID: \"f3a29d0b-189d-44fa-a64e-d7988ea4031c\") " Mar 19 20:06:05 crc kubenswrapper[4826]: I0319 20:06:05.611976 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4" (OuterVolumeSpecName: "kube-api-access-6xnq4") pod "f3a29d0b-189d-44fa-a64e-d7988ea4031c" (UID: "f3a29d0b-189d-44fa-a64e-d7988ea4031c"). InnerVolumeSpecName "kube-api-access-6xnq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:06:05 crc kubenswrapper[4826]: I0319 20:06:05.691066 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xnq4\" (UniqueName: \"kubernetes.io/projected/f3a29d0b-189d-44fa-a64e-d7988ea4031c-kube-api-access-6xnq4\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:05 crc kubenswrapper[4826]: I0319 20:06:05.988175 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565846-49tqd" Mar 19 20:06:06 crc kubenswrapper[4826]: I0319 20:06:06.024923 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565846-49tqd" event={"ID":"f3a29d0b-189d-44fa-a64e-d7988ea4031c","Type":"ContainerDied","Data":"91d03876ee97254f150f3b3ac12ce738e92923322b300b6d81306b9884ab85f0"} Mar 19 20:06:06 crc kubenswrapper[4826]: I0319 20:06:06.024978 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91d03876ee97254f150f3b3ac12ce738e92923322b300b6d81306b9884ab85f0" Mar 19 20:06:06 crc kubenswrapper[4826]: I0319 20:06:06.034955 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-cpxqz"] Mar 19 20:06:06 crc kubenswrapper[4826]: I0319 20:06:06.046812 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565840-cpxqz"] Mar 19 20:06:07 crc kubenswrapper[4826]: I0319 20:06:07.994991 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4124973-19ff-4e28-b7ce-94423ac25de8" path="/var/lib/kubelet/pods/b4124973-19ff-4e28-b7ce-94423ac25de8/volumes" Mar 19 20:06:08 crc kubenswrapper[4826]: I0319 20:06:08.885897 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:06:08 crc kubenswrapper[4826]: I0319 20:06:08.978877 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:06:09 crc kubenswrapper[4826]: I0319 20:06:09.671863 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:06:10 crc kubenswrapper[4826]: I0319 20:06:10.046455 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2dlwd" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" 
containerName="registry-server" containerID="cri-o://ea8083dd987309834e310327047505670714e9eceec3e1bf19bae814150040f7" gracePeriod=2 Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.061691 4826 generic.go:334] "Generic (PLEG): container finished" podID="594a0e00-3ed1-487e-ac55-db5940afa789" containerID="ea8083dd987309834e310327047505670714e9eceec3e1bf19bae814150040f7" exitCode=0 Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.061731 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerDied","Data":"ea8083dd987309834e310327047505670714e9eceec3e1bf19bae814150040f7"} Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.062096 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dlwd" event={"ID":"594a0e00-3ed1-487e-ac55-db5940afa789","Type":"ContainerDied","Data":"c89786d594d532ed112701e059c54151d0bf6216de7b7657575c5d5ebbc19ba1"} Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.062123 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c89786d594d532ed112701e059c54151d0bf6216de7b7657575c5d5ebbc19ba1" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.137713 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.242214 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2q57\" (UniqueName: \"kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57\") pod \"594a0e00-3ed1-487e-ac55-db5940afa789\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.242527 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content\") pod \"594a0e00-3ed1-487e-ac55-db5940afa789\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.242727 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities\") pod \"594a0e00-3ed1-487e-ac55-db5940afa789\" (UID: \"594a0e00-3ed1-487e-ac55-db5940afa789\") " Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.244160 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities" (OuterVolumeSpecName: "utilities") pod "594a0e00-3ed1-487e-ac55-db5940afa789" (UID: "594a0e00-3ed1-487e-ac55-db5940afa789"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.261896 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57" (OuterVolumeSpecName: "kube-api-access-n2q57") pod "594a0e00-3ed1-487e-ac55-db5940afa789" (UID: "594a0e00-3ed1-487e-ac55-db5940afa789"). InnerVolumeSpecName "kube-api-access-n2q57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.346107 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.346326 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2q57\" (UniqueName: \"kubernetes.io/projected/594a0e00-3ed1-487e-ac55-db5940afa789-kube-api-access-n2q57\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.419708 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "594a0e00-3ed1-487e-ac55-db5940afa789" (UID: "594a0e00-3ed1-487e-ac55-db5940afa789"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:06:11 crc kubenswrapper[4826]: I0319 20:06:11.449742 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/594a0e00-3ed1-487e-ac55-db5940afa789-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:06:12 crc kubenswrapper[4826]: I0319 20:06:12.074228 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2dlwd" Mar 19 20:06:12 crc kubenswrapper[4826]: I0319 20:06:12.110602 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:06:12 crc kubenswrapper[4826]: I0319 20:06:12.125961 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2dlwd"] Mar 19 20:06:13 crc kubenswrapper[4826]: I0319 20:06:13.999349 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" path="/var/lib/kubelet/pods/594a0e00-3ed1-487e-ac55-db5940afa789/volumes" Mar 19 20:06:25 crc kubenswrapper[4826]: E0319 20:06:25.069054 4826 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:55494->38.102.83.69:34213: write tcp 38.102.83.69:55494->38.102.83.69:34213: write: broken pipe Mar 19 20:06:35 crc kubenswrapper[4826]: I0319 20:06:35.189679 4826 scope.go:117] "RemoveContainer" containerID="b043cd8f5631b5bd47670db8780d74581e614575119f02b354d622f3f423f5c4" Mar 19 20:06:55 crc kubenswrapper[4826]: I0319 20:06:55.400397 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:06:55 crc kubenswrapper[4826]: I0319 20:06:55.401015 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:07:25 crc kubenswrapper[4826]: I0319 20:07:25.400396 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:07:25 crc kubenswrapper[4826]: I0319 20:07:25.400953 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:07:55 crc kubenswrapper[4826]: I0319 20:07:55.400852 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:07:55 crc kubenswrapper[4826]: I0319 20:07:55.401307 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:07:55 crc kubenswrapper[4826]: I0319 20:07:55.401377 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 20:07:55 crc kubenswrapper[4826]: I0319 20:07:55.402189 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 19 20:07:55 crc kubenswrapper[4826]: I0319 20:07:55.402243 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" gracePeriod=600 Mar 19 20:07:56 crc kubenswrapper[4826]: E0319 20:07:56.174110 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:07:56 crc kubenswrapper[4826]: I0319 20:07:56.458270 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" exitCode=0 Mar 19 20:07:56 crc kubenswrapper[4826]: I0319 20:07:56.458332 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"} Mar 19 20:07:56 crc kubenswrapper[4826]: I0319 20:07:56.458386 4826 scope.go:117] "RemoveContainer" containerID="2753947510effd8dc69a5256d346c2fd89d2c31e9d6c5a8623efc428660b5aaa" Mar 19 20:07:56 crc kubenswrapper[4826]: I0319 20:07:56.459785 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:07:56 crc kubenswrapper[4826]: E0319 20:07:56.460295 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.159483 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565848-9hxcr"] Mar 19 20:08:00 crc kubenswrapper[4826]: E0319 20:08:00.161705 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a29d0b-189d-44fa-a64e-d7988ea4031c" containerName="oc" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.162202 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a29d0b-189d-44fa-a64e-d7988ea4031c" containerName="oc" Mar 19 20:08:00 crc kubenswrapper[4826]: E0319 20:08:00.162344 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="registry-server" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.162431 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="registry-server" Mar 19 20:08:00 crc kubenswrapper[4826]: E0319 20:08:00.162567 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="extract-content" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.162676 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="extract-content" Mar 19 20:08:00 crc kubenswrapper[4826]: E0319 20:08:00.162778 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="extract-utilities" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.162885 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="extract-utilities" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.163297 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="594a0e00-3ed1-487e-ac55-db5940afa789" containerName="registry-server" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.163439 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a29d0b-189d-44fa-a64e-d7988ea4031c" containerName="oc" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.164474 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.168931 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.168931 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.171247 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.174536 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-9hxcr"] Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.284923 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdmc\" (UniqueName: \"kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc\") pod \"auto-csr-approver-29565848-9hxcr\" (UID: \"93d364b9-495c-49a7-9c05-b6f0194baff2\") " pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.387683 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdmc\" (UniqueName: 
\"kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc\") pod \"auto-csr-approver-29565848-9hxcr\" (UID: \"93d364b9-495c-49a7-9c05-b6f0194baff2\") " pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.408403 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdmc\" (UniqueName: \"kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc\") pod \"auto-csr-approver-29565848-9hxcr\" (UID: \"93d364b9-495c-49a7-9c05-b6f0194baff2\") " pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:00 crc kubenswrapper[4826]: I0319 20:08:00.496863 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:01 crc kubenswrapper[4826]: I0319 20:08:01.001738 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-9hxcr"] Mar 19 20:08:01 crc kubenswrapper[4826]: I0319 20:08:01.523298 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" event={"ID":"93d364b9-495c-49a7-9c05-b6f0194baff2","Type":"ContainerStarted","Data":"30dbf410a5d3b57ba9e63e5b22ec7f194b721639884a037bb0c3f930c67cbe26"} Mar 19 20:08:02 crc kubenswrapper[4826]: I0319 20:08:02.548198 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" event={"ID":"93d364b9-495c-49a7-9c05-b6f0194baff2","Type":"ContainerStarted","Data":"2c3a27dcd08892facd14d19c4f81989418a74d404528ba37ee3545669c729224"} Mar 19 20:08:02 crc kubenswrapper[4826]: I0319 20:08:02.629580 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" podStartSLOduration=1.6832065410000001 podStartE2EDuration="2.629555695s" podCreationTimestamp="2026-03-19 20:08:00 +0000 UTC" 
firstStartedPulling="2026-03-19 20:08:01.007124304 +0000 UTC m=+4305.761192657" lastFinishedPulling="2026-03-19 20:08:01.953473458 +0000 UTC m=+4306.707541811" observedRunningTime="2026-03-19 20:08:02.603845513 +0000 UTC m=+4307.357913826" watchObservedRunningTime="2026-03-19 20:08:02.629555695 +0000 UTC m=+4307.383624008" Mar 19 20:08:03 crc kubenswrapper[4826]: I0319 20:08:03.558461 4826 generic.go:334] "Generic (PLEG): container finished" podID="93d364b9-495c-49a7-9c05-b6f0194baff2" containerID="2c3a27dcd08892facd14d19c4f81989418a74d404528ba37ee3545669c729224" exitCode=0 Mar 19 20:08:03 crc kubenswrapper[4826]: I0319 20:08:03.558527 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" event={"ID":"93d364b9-495c-49a7-9c05-b6f0194baff2","Type":"ContainerDied","Data":"2c3a27dcd08892facd14d19c4f81989418a74d404528ba37ee3545669c729224"} Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.058630 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.191446 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxdmc\" (UniqueName: \"kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc\") pod \"93d364b9-495c-49a7-9c05-b6f0194baff2\" (UID: \"93d364b9-495c-49a7-9c05-b6f0194baff2\") " Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.203858 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc" (OuterVolumeSpecName: "kube-api-access-pxdmc") pod "93d364b9-495c-49a7-9c05-b6f0194baff2" (UID: "93d364b9-495c-49a7-9c05-b6f0194baff2"). InnerVolumeSpecName "kube-api-access-pxdmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.294325 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxdmc\" (UniqueName: \"kubernetes.io/projected/93d364b9-495c-49a7-9c05-b6f0194baff2-kube-api-access-pxdmc\") on node \"crc\" DevicePath \"\"" Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.590149 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" event={"ID":"93d364b9-495c-49a7-9c05-b6f0194baff2","Type":"ContainerDied","Data":"30dbf410a5d3b57ba9e63e5b22ec7f194b721639884a037bb0c3f930c67cbe26"} Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.590194 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30dbf410a5d3b57ba9e63e5b22ec7f194b721639884a037bb0c3f930c67cbe26" Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.590218 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565848-9hxcr" Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.666769 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-4dzdg"] Mar 19 20:08:05 crc kubenswrapper[4826]: I0319 20:08:05.683647 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565842-4dzdg"] Mar 19 20:08:06 crc kubenswrapper[4826]: I0319 20:08:06.002935 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2c54a8-551d-4e31-a2cd-c42f4b7c3021" path="/var/lib/kubelet/pods/9d2c54a8-551d-4e31-a2cd-c42f4b7c3021/volumes" Mar 19 20:08:08 crc kubenswrapper[4826]: I0319 20:08:08.978359 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:08:08 crc kubenswrapper[4826]: E0319 20:08:08.979218 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:08:23 crc kubenswrapper[4826]: I0319 20:08:23.980735 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:08:23 crc kubenswrapper[4826]: E0319 20:08:23.981865 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:08:35 crc kubenswrapper[4826]: I0319 20:08:35.316183 4826 scope.go:117] "RemoveContainer" containerID="8701c874e8a3369005f16919b1594de37677752f3f7df185f4e63dfb70b33b59" Mar 19 20:08:35 crc kubenswrapper[4826]: I0319 20:08:35.989002 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:08:35 crc kubenswrapper[4826]: E0319 20:08:35.989955 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:08:48 crc kubenswrapper[4826]: I0319 20:08:48.977020 4826 scope.go:117] "RemoveContainer" 
containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:08:48 crc kubenswrapper[4826]: E0319 20:08:48.977841 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:09:00 crc kubenswrapper[4826]: I0319 20:09:00.975964 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:09:00 crc kubenswrapper[4826]: E0319 20:09:00.976860 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:09:13 crc kubenswrapper[4826]: I0319 20:09:13.976266 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:09:13 crc kubenswrapper[4826]: E0319 20:09:13.977109 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:09:27 crc kubenswrapper[4826]: I0319 20:09:27.976566 4826 scope.go:117] 
"RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:09:27 crc kubenswrapper[4826]: E0319 20:09:27.977493 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.413372 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:29 crc kubenswrapper[4826]: E0319 20:09:29.429702 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d364b9-495c-49a7-9c05-b6f0194baff2" containerName="oc" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.429736 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d364b9-495c-49a7-9c05-b6f0194baff2" containerName="oc" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.430168 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d364b9-495c-49a7-9c05-b6f0194baff2" containerName="oc" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.432238 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.432343 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.610373 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.611375 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.611505 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrgw\" (UniqueName: \"kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.714221 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.714291 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities\") pod 
\"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.714322 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrgw\" (UniqueName: \"kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.714844 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.715194 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.768878 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrgw\" (UniqueName: \"kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw\") pod \"certified-operators-2wr5q\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:29 crc kubenswrapper[4826]: I0319 20:09:29.780176 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:30 crc kubenswrapper[4826]: I0319 20:09:30.374149 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:30 crc kubenswrapper[4826]: I0319 20:09:30.709579 4826 generic.go:334] "Generic (PLEG): container finished" podID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerID="dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa" exitCode=0 Mar 19 20:09:30 crc kubenswrapper[4826]: I0319 20:09:30.709807 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerDied","Data":"dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa"} Mar 19 20:09:30 crc kubenswrapper[4826]: I0319 20:09:30.710013 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerStarted","Data":"ff815051c9a7163f298d711770ca21d8349d6b58cb3db9cf6985aad14e8ad87a"} Mar 19 20:09:30 crc kubenswrapper[4826]: I0319 20:09:30.712858 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:09:32 crc kubenswrapper[4826]: I0319 20:09:32.742777 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerStarted","Data":"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1"} Mar 19 20:09:33 crc kubenswrapper[4826]: I0319 20:09:33.756614 4826 generic.go:334] "Generic (PLEG): container finished" podID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerID="1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1" exitCode=0 Mar 19 20:09:33 crc kubenswrapper[4826]: I0319 20:09:33.756920 4826 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerDied","Data":"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1"} Mar 19 20:09:34 crc kubenswrapper[4826]: I0319 20:09:34.771862 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerStarted","Data":"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844"} Mar 19 20:09:34 crc kubenswrapper[4826]: I0319 20:09:34.791039 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2wr5q" podStartSLOduration=2.146877244 podStartE2EDuration="5.791020179s" podCreationTimestamp="2026-03-19 20:09:29 +0000 UTC" firstStartedPulling="2026-03-19 20:09:30.712552773 +0000 UTC m=+4395.466621086" lastFinishedPulling="2026-03-19 20:09:34.356695698 +0000 UTC m=+4399.110764021" observedRunningTime="2026-03-19 20:09:34.788857497 +0000 UTC m=+4399.542925820" watchObservedRunningTime="2026-03-19 20:09:34.791020179 +0000 UTC m=+4399.545088502" Mar 19 20:09:39 crc kubenswrapper[4826]: I0319 20:09:39.780380 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:39 crc kubenswrapper[4826]: I0319 20:09:39.781124 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:39 crc kubenswrapper[4826]: I0319 20:09:39.850565 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:39 crc kubenswrapper[4826]: I0319 20:09:39.931884 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:40 crc kubenswrapper[4826]: I0319 
20:09:40.101539 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:41 crc kubenswrapper[4826]: I0319 20:09:41.852919 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2wr5q" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="registry-server" containerID="cri-o://dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844" gracePeriod=2 Mar 19 20:09:41 crc kubenswrapper[4826]: I0319 20:09:41.979323 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:09:41 crc kubenswrapper[4826]: E0319 20:09:41.979890 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.411805 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.584818 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdrgw\" (UniqueName: \"kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw\") pod \"593dbadd-7d6d-4fb8-afc2-affede4bf261\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.585101 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities\") pod \"593dbadd-7d6d-4fb8-afc2-affede4bf261\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.585257 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content\") pod \"593dbadd-7d6d-4fb8-afc2-affede4bf261\" (UID: \"593dbadd-7d6d-4fb8-afc2-affede4bf261\") " Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.586575 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities" (OuterVolumeSpecName: "utilities") pod "593dbadd-7d6d-4fb8-afc2-affede4bf261" (UID: "593dbadd-7d6d-4fb8-afc2-affede4bf261"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.598232 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw" (OuterVolumeSpecName: "kube-api-access-xdrgw") pod "593dbadd-7d6d-4fb8-afc2-affede4bf261" (UID: "593dbadd-7d6d-4fb8-afc2-affede4bf261"). InnerVolumeSpecName "kube-api-access-xdrgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.688533 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.688577 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdrgw\" (UniqueName: \"kubernetes.io/projected/593dbadd-7d6d-4fb8-afc2-affede4bf261-kube-api-access-xdrgw\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.769025 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "593dbadd-7d6d-4fb8-afc2-affede4bf261" (UID: "593dbadd-7d6d-4fb8-afc2-affede4bf261"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.790557 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dbadd-7d6d-4fb8-afc2-affede4bf261-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.867600 4826 generic.go:334] "Generic (PLEG): container finished" podID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerID="dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844" exitCode=0 Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.867682 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2wr5q" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.867697 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerDied","Data":"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844"} Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.868126 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wr5q" event={"ID":"593dbadd-7d6d-4fb8-afc2-affede4bf261","Type":"ContainerDied","Data":"ff815051c9a7163f298d711770ca21d8349d6b58cb3db9cf6985aad14e8ad87a"} Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.868157 4826 scope.go:117] "RemoveContainer" containerID="dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.889175 4826 scope.go:117] "RemoveContainer" containerID="1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.926727 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.931106 4826 scope.go:117] "RemoveContainer" containerID="dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.939370 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2wr5q"] Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.981727 4826 scope.go:117] "RemoveContainer" containerID="dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844" Mar 19 20:09:42 crc kubenswrapper[4826]: E0319 20:09:42.983170 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844\": container with ID starting with dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844 not found: ID does not exist" containerID="dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.983428 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844"} err="failed to get container status \"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844\": rpc error: code = NotFound desc = could not find container \"dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844\": container with ID starting with dac29b09f77d6c2c9d7a5de83cd70e6bb8432a588f64eed9685ea5671dfae844 not found: ID does not exist" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.983490 4826 scope.go:117] "RemoveContainer" containerID="1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1" Mar 19 20:09:42 crc kubenswrapper[4826]: E0319 20:09:42.984171 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1\": container with ID starting with 1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1 not found: ID does not exist" containerID="1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.984291 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1"} err="failed to get container status \"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1\": rpc error: code = NotFound desc = could not find container \"1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1\": container with ID 
starting with 1181f1e85d37968dca11a9bf80a6bf2eca2a940a56f4783cd9cc143174606cc1 not found: ID does not exist" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.984597 4826 scope.go:117] "RemoveContainer" containerID="dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa" Mar 19 20:09:42 crc kubenswrapper[4826]: E0319 20:09:42.985021 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa\": container with ID starting with dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa not found: ID does not exist" containerID="dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa" Mar 19 20:09:42 crc kubenswrapper[4826]: I0319 20:09:42.985047 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa"} err="failed to get container status \"dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa\": rpc error: code = NotFound desc = could not find container \"dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa\": container with ID starting with dfbf8115086b049cab83b0a9eb40204186fa8bd1aac2e0be0d9bba7397e461aa not found: ID does not exist" Mar 19 20:09:44 crc kubenswrapper[4826]: I0319 20:09:44.003485 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" path="/var/lib/kubelet/pods/593dbadd-7d6d-4fb8-afc2-affede4bf261/volumes" Mar 19 20:09:54 crc kubenswrapper[4826]: I0319 20:09:54.978129 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:09:54 crc kubenswrapper[4826]: E0319 20:09:54.978960 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.234843 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565850-df5nq"] Mar 19 20:10:00 crc kubenswrapper[4826]: E0319 20:10:00.235913 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="registry-server" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.235926 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="registry-server" Mar 19 20:10:00 crc kubenswrapper[4826]: E0319 20:10:00.235961 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="extract-content" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.235966 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="extract-content" Mar 19 20:10:00 crc kubenswrapper[4826]: E0319 20:10:00.235977 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="extract-utilities" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.235984 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="extract-utilities" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.236227 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="593dbadd-7d6d-4fb8-afc2-affede4bf261" containerName="registry-server" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.237023 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.240490 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.240604 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.240761 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.250118 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-df5nq"] Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.274272 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9pt\" (UniqueName: \"kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt\") pod \"auto-csr-approver-29565850-df5nq\" (UID: \"ee52295e-4450-4ddd-8da0-07a48a53694f\") " pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.377590 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9pt\" (UniqueName: \"kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt\") pod \"auto-csr-approver-29565850-df5nq\" (UID: \"ee52295e-4450-4ddd-8da0-07a48a53694f\") " pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.402397 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9pt\" (UniqueName: \"kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt\") pod \"auto-csr-approver-29565850-df5nq\" (UID: \"ee52295e-4450-4ddd-8da0-07a48a53694f\") " 
pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:00 crc kubenswrapper[4826]: I0319 20:10:00.567009 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:01 crc kubenswrapper[4826]: I0319 20:10:01.052821 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-df5nq"] Mar 19 20:10:01 crc kubenswrapper[4826]: I0319 20:10:01.165912 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-df5nq" event={"ID":"ee52295e-4450-4ddd-8da0-07a48a53694f","Type":"ContainerStarted","Data":"3c03e19af4bd6580062e85dedafead5b94c3d945e35a9d8acac38dd0618116bc"} Mar 19 20:10:03 crc kubenswrapper[4826]: I0319 20:10:03.189731 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee52295e-4450-4ddd-8da0-07a48a53694f" containerID="17e4391027936fd369c038fedd40d620a05c047ebf7d0009fcb95507e3cc054a" exitCode=0 Mar 19 20:10:03 crc kubenswrapper[4826]: I0319 20:10:03.190284 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-df5nq" event={"ID":"ee52295e-4450-4ddd-8da0-07a48a53694f","Type":"ContainerDied","Data":"17e4391027936fd369c038fedd40d620a05c047ebf7d0009fcb95507e3cc054a"} Mar 19 20:10:04 crc kubenswrapper[4826]: I0319 20:10:04.822550 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-df5nq" Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.023273 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr9pt\" (UniqueName: \"kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt\") pod \"ee52295e-4450-4ddd-8da0-07a48a53694f\" (UID: \"ee52295e-4450-4ddd-8da0-07a48a53694f\") " Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.028831 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt" (OuterVolumeSpecName: "kube-api-access-cr9pt") pod "ee52295e-4450-4ddd-8da0-07a48a53694f" (UID: "ee52295e-4450-4ddd-8da0-07a48a53694f"). InnerVolumeSpecName "kube-api-access-cr9pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.129395 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr9pt\" (UniqueName: \"kubernetes.io/projected/ee52295e-4450-4ddd-8da0-07a48a53694f-kube-api-access-cr9pt\") on node \"crc\" DevicePath \"\"" Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.232808 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565850-df5nq" event={"ID":"ee52295e-4450-4ddd-8da0-07a48a53694f","Type":"ContainerDied","Data":"3c03e19af4bd6580062e85dedafead5b94c3d945e35a9d8acac38dd0618116bc"} Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.232868 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c03e19af4bd6580062e85dedafead5b94c3d945e35a9d8acac38dd0618116bc" Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.232896 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565850-df5nq"
Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.934405 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-42ht6"]
Mar 19 20:10:05 crc kubenswrapper[4826]: I0319 20:10:05.953337 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565844-42ht6"]
Mar 19 20:10:06 crc kubenswrapper[4826]: I0319 20:10:06.002024 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1e329f-1f51-4363-92bf-b7922d6a4df5" path="/var/lib/kubelet/pods/ab1e329f-1f51-4363-92bf-b7922d6a4df5/volumes"
Mar 19 20:10:08 crc kubenswrapper[4826]: I0319 20:10:08.978008 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:10:08 crc kubenswrapper[4826]: E0319 20:10:08.979416 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:10:13 crc kubenswrapper[4826]: E0319 20:10:13.436702 4826 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:45354->38.102.83.69:34213: write tcp 38.102.83.69:45354->38.102.83.69:34213: write: broken pipe
Mar 19 20:10:21 crc kubenswrapper[4826]: I0319 20:10:21.977419 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:10:21 crc kubenswrapper[4826]: E0319 20:10:21.978964 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:10:36 crc kubenswrapper[4826]: I0319 20:10:36.077452 4826 scope.go:117] "RemoveContainer" containerID="2af134f899da8a5d2dea2aa1c80bf6eeaec2bae25e66a15e34dda4868f03576f"
Mar 19 20:10:36 crc kubenswrapper[4826]: I0319 20:10:36.976864 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:10:36 crc kubenswrapper[4826]: E0319 20:10:36.977512 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:10:50 crc kubenswrapper[4826]: I0319 20:10:50.977389 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:10:50 crc kubenswrapper[4826]: E0319 20:10:50.978372 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:11:03 crc kubenswrapper[4826]: I0319 20:11:03.975999 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:11:03 crc kubenswrapper[4826]: E0319 20:11:03.976974 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:11:15 crc kubenswrapper[4826]: I0319 20:11:15.996168 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:11:15 crc kubenswrapper[4826]: E0319 20:11:15.997905 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:11:29 crc kubenswrapper[4826]: I0319 20:11:29.976979 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:11:29 crc kubenswrapper[4826]: E0319 20:11:29.977805 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:11:42 crc kubenswrapper[4826]: I0319 20:11:42.977745 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:11:42 crc kubenswrapper[4826]: E0319 20:11:42.979160 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:11:56 crc kubenswrapper[4826]: I0319 20:11:56.977119 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:11:56 crc kubenswrapper[4826]: E0319 20:11:56.979034 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.170043 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565852-knh8r"]
Mar 19 20:12:00 crc kubenswrapper[4826]: E0319 20:12:00.171131 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee52295e-4450-4ddd-8da0-07a48a53694f" containerName="oc"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.171153 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee52295e-4450-4ddd-8da0-07a48a53694f" containerName="oc"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.171613 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee52295e-4450-4ddd-8da0-07a48a53694f" containerName="oc"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.172968 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.175637 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.176650 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.177572 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.184518 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-knh8r"]
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.349091 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7df\" (UniqueName: \"kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df\") pod \"auto-csr-approver-29565852-knh8r\" (UID: \"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7\") " pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:00 crc kubenswrapper[4826]: I0319 20:12:00.452201 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7df\" (UniqueName: \"kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df\") pod \"auto-csr-approver-29565852-knh8r\" (UID: \"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7\") " pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:01 crc kubenswrapper[4826]: I0319 20:12:01.080860 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7df\" (UniqueName: \"kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df\") pod \"auto-csr-approver-29565852-knh8r\" (UID: \"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7\") " pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:01 crc kubenswrapper[4826]: I0319 20:12:01.103133 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:01 crc kubenswrapper[4826]: I0319 20:12:01.622749 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-knh8r"]
Mar 19 20:12:01 crc kubenswrapper[4826]: I0319 20:12:01.884813 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-knh8r" event={"ID":"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7","Type":"ContainerStarted","Data":"2e23ce14909bfeccf853de20e540cde0f1bda4c35695a95b2cd73490bfbc28dc"}
Mar 19 20:12:03 crc kubenswrapper[4826]: I0319 20:12:03.912318 4826 generic.go:334] "Generic (PLEG): container finished" podID="d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" containerID="d0993fecaa7e2ef7f6f11e5a016a4032e34b14c64a539a8ca2f4d6a9e3ff9adf" exitCode=0
Mar 19 20:12:03 crc kubenswrapper[4826]: I0319 20:12:03.912410 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-knh8r" event={"ID":"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7","Type":"ContainerDied","Data":"d0993fecaa7e2ef7f6f11e5a016a4032e34b14c64a539a8ca2f4d6a9e3ff9adf"}
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.419328 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.518001 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr7df\" (UniqueName: \"kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df\") pod \"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7\" (UID: \"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7\") "
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.530963 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df" (OuterVolumeSpecName: "kube-api-access-nr7df") pod "d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" (UID: "d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7"). InnerVolumeSpecName "kube-api-access-nr7df". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.620589 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr7df\" (UniqueName: \"kubernetes.io/projected/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7-kube-api-access-nr7df\") on node \"crc\" DevicePath \"\""
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.941605 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565852-knh8r" event={"ID":"d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7","Type":"ContainerDied","Data":"2e23ce14909bfeccf853de20e540cde0f1bda4c35695a95b2cd73490bfbc28dc"}
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.942017 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e23ce14909bfeccf853de20e540cde0f1bda4c35695a95b2cd73490bfbc28dc"
Mar 19 20:12:05 crc kubenswrapper[4826]: I0319 20:12:05.941748 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565852-knh8r"
Mar 19 20:12:06 crc kubenswrapper[4826]: I0319 20:12:06.543004 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-49tqd"]
Mar 19 20:12:06 crc kubenswrapper[4826]: I0319 20:12:06.557682 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565846-49tqd"]
Mar 19 20:12:07 crc kubenswrapper[4826]: I0319 20:12:07.992574 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a29d0b-189d-44fa-a64e-d7988ea4031c" path="/var/lib/kubelet/pods/f3a29d0b-189d-44fa-a64e-d7988ea4031c/volumes"
Mar 19 20:12:10 crc kubenswrapper[4826]: I0319 20:12:10.976546 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:12:10 crc kubenswrapper[4826]: E0319 20:12:10.977302 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:12:22 crc kubenswrapper[4826]: I0319 20:12:22.977299 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:12:22 crc kubenswrapper[4826]: E0319 20:12:22.978782 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:12:36 crc kubenswrapper[4826]: I0319 20:12:36.241935 4826 scope.go:117] "RemoveContainer" containerID="ee4aa50ecb03774f585362bcf3e0b586a3b03979c12abbb7de98ddcba20ebb23"
Mar 19 20:12:36 crc kubenswrapper[4826]: I0319 20:12:36.293772 4826 scope.go:117] "RemoveContainer" containerID="3cd4c0afe04b27b45ef7b705fc738f551d2440b86a0330a9551a5e2d7c5df677"
Mar 19 20:12:36 crc kubenswrapper[4826]: I0319 20:12:36.374703 4826 scope.go:117] "RemoveContainer" containerID="da8af6207021e2cac817da2b8fc34ec1a5956ab409e23a13207ea94fb036bb6e"
Mar 19 20:12:36 crc kubenswrapper[4826]: I0319 20:12:36.509831 4826 scope.go:117] "RemoveContainer" containerID="ea8083dd987309834e310327047505670714e9eceec3e1bf19bae814150040f7"
Mar 19 20:12:37 crc kubenswrapper[4826]: I0319 20:12:37.976974 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:12:37 crc kubenswrapper[4826]: E0319 20:12:37.978098 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.750894 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p9rmq"]
Mar 19 20:12:48 crc kubenswrapper[4826]: E0319 20:12:48.751981 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" containerName="oc"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.751999 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" containerName="oc"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.752256 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" containerName="oc"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.755227 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.771822 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9rmq"]
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.880802 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-catalog-content\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.880965 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-utilities\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.881124 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxbjs\" (UniqueName: \"kubernetes.io/projected/3a20f6a8-01f3-4492-856d-e5f494672fa3-kube-api-access-hxbjs\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.982962 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-catalog-content\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.983277 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-utilities\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.983449 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxbjs\" (UniqueName: \"kubernetes.io/projected/3a20f6a8-01f3-4492-856d-e5f494672fa3-kube-api-access-hxbjs\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.983981 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-catalog-content\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:48 crc kubenswrapper[4826]: I0319 20:12:48.984076 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a20f6a8-01f3-4492-856d-e5f494672fa3-utilities\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:49 crc kubenswrapper[4826]: I0319 20:12:49.001001 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxbjs\" (UniqueName: \"kubernetes.io/projected/3a20f6a8-01f3-4492-856d-e5f494672fa3-kube-api-access-hxbjs\") pod \"community-operators-p9rmq\" (UID: \"3a20f6a8-01f3-4492-856d-e5f494672fa3\") " pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:49 crc kubenswrapper[4826]: I0319 20:12:49.089874 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:12:49 crc kubenswrapper[4826]: I0319 20:12:49.796686 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9rmq"]
Mar 19 20:12:50 crc kubenswrapper[4826]: I0319 20:12:50.572801 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerStarted","Data":"601189c1e732bebc9351e307fb9c52d88050e0065959d2d742a233bf90cd7990"}
Mar 19 20:12:51 crc kubenswrapper[4826]: I0319 20:12:51.589788 4826 generic.go:334] "Generic (PLEG): container finished" podID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerID="ba9dc5be262f80f55f34565661c39a39ae31550ceb677a11f2f9130a03b66e77" exitCode=0
Mar 19 20:12:51 crc kubenswrapper[4826]: I0319 20:12:51.589880 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerDied","Data":"ba9dc5be262f80f55f34565661c39a39ae31550ceb677a11f2f9130a03b66e77"}
Mar 19 20:12:51 crc kubenswrapper[4826]: I0319 20:12:51.979172 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:12:51 crc kubenswrapper[4826]: E0319 20:12:51.979979 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.595786 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"]
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.602633 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.622919 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.623151 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.623406 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmm9\" (UniqueName: \"kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.652845 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"]
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.726448 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.726568 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.726720 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmm9\" (UniqueName: \"kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.727260 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.727364 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.768472 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmm9\" (UniqueName: \"kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9\") pod \"redhat-marketplace-rrk78\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:54 crc kubenswrapper[4826]: I0319 20:12:54.929970 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:12:58 crc kubenswrapper[4826]: I0319 20:12:58.511425 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"]
Mar 19 20:12:58 crc kubenswrapper[4826]: I0319 20:12:58.670261 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerStarted","Data":"345ebb1574b1080dab2e06685ad60d093e1f0acd3b169074d8319b04305780da"}
Mar 19 20:12:58 crc kubenswrapper[4826]: I0319 20:12:58.672789 4826 generic.go:334] "Generic (PLEG): container finished" podID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerID="0b9d9011dec4866cb987c3f95bf4dd70774094102d96f822987e3db728f7f599" exitCode=0
Mar 19 20:12:58 crc kubenswrapper[4826]: I0319 20:12:58.672854 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerDied","Data":"0b9d9011dec4866cb987c3f95bf4dd70774094102d96f822987e3db728f7f599"}
Mar 19 20:12:59 crc kubenswrapper[4826]: I0319 20:12:59.693161 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerStarted","Data":"9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a"}
Mar 19 20:12:59 crc kubenswrapper[4826]: I0319 20:12:59.696428 4826 generic.go:334] "Generic (PLEG): container finished" podID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerID="066d919609a203fbf6ee5c1354f5bb1b4e2be3e282328de3b4df56a56ed46b69" exitCode=0
Mar 19 20:12:59 crc kubenswrapper[4826]: I0319 20:12:59.696489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerDied","Data":"066d919609a203fbf6ee5c1354f5bb1b4e2be3e282328de3b4df56a56ed46b69"}
Mar 19 20:12:59 crc kubenswrapper[4826]: I0319 20:12:59.733065 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p9rmq" podStartSLOduration=4.2036327270000005 podStartE2EDuration="11.733036326s" podCreationTimestamp="2026-03-19 20:12:48 +0000 UTC" firstStartedPulling="2026-03-19 20:12:51.592736178 +0000 UTC m=+4596.346804491" lastFinishedPulling="2026-03-19 20:12:59.122139747 +0000 UTC m=+4603.876208090" observedRunningTime="2026-03-19 20:12:59.719282793 +0000 UTC m=+4604.473351116" watchObservedRunningTime="2026-03-19 20:12:59.733036326 +0000 UTC m=+4604.487104659"
Mar 19 20:13:00 crc kubenswrapper[4826]: I0319 20:13:00.710299 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerStarted","Data":"6e9b50033ff0540fd15c4aa6a4c10430ce7208c816f58ba71b78773c8abd4b58"}
Mar 19 20:13:02 crc kubenswrapper[4826]: I0319 20:13:02.773682 4826 generic.go:334] "Generic (PLEG): container finished" podID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerID="6e9b50033ff0540fd15c4aa6a4c10430ce7208c816f58ba71b78773c8abd4b58" exitCode=0
Mar 19 20:13:02 crc kubenswrapper[4826]: I0319 20:13:02.774087 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerDied","Data":"6e9b50033ff0540fd15c4aa6a4c10430ce7208c816f58ba71b78773c8abd4b58"}
Mar 19 20:13:03 crc kubenswrapper[4826]: I0319 20:13:03.789260 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerStarted","Data":"1e23d27deaba0bfc72edb18ffe7f681743d08f73feb215f05cdd5e29a9995a5d"}
Mar 19 20:13:03 crc kubenswrapper[4826]: I0319 20:13:03.820589 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrk78" podStartSLOduration=6.00569427 podStartE2EDuration="9.820568401s" podCreationTimestamp="2026-03-19 20:12:54 +0000 UTC" firstStartedPulling="2026-03-19 20:12:59.698291304 +0000 UTC m=+4604.452359637" lastFinishedPulling="2026-03-19 20:13:03.513165455 +0000 UTC m=+4608.267233768" observedRunningTime="2026-03-19 20:13:03.80693197 +0000 UTC m=+4608.561000273" watchObservedRunningTime="2026-03-19 20:13:03.820568401 +0000 UTC m=+4608.574636714"
Mar 19 20:13:03 crc kubenswrapper[4826]: I0319 20:13:03.976134 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d"
Mar 19 20:13:04 crc kubenswrapper[4826]: I0319 20:13:04.930734 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:13:04 crc kubenswrapper[4826]: I0319 20:13:04.931321 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rrk78"
Mar 19 20:13:05 crc kubenswrapper[4826]: I0319 20:13:05.812625 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482"}
Mar 19 20:13:05 crc kubenswrapper[4826]: I0319 20:13:05.990033 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rrk78" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:13:05 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:13:05 crc kubenswrapper[4826]: >
Mar 19 20:13:09 crc kubenswrapper[4826]: I0319 20:13:09.090185 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:13:09 crc kubenswrapper[4826]: I0319 20:13:09.090706 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:13:10 crc kubenswrapper[4826]: I0319 20:13:10.164276 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:13:10 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:13:10 crc kubenswrapper[4826]: >
Mar 19 20:13:16 crc kubenswrapper[4826]: I0319 20:13:16.016393 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-rrk78" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:13:16 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:13:16 crc kubenswrapper[4826]: >
Mar 19 20:13:19 crc kubenswrapper[4826]: I0319 20:13:19.158191 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:13:19 crc kubenswrapper[4826]: I0319 20:13:19.369309 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p9rmq"
Mar 19 20:13:19 crc kubenswrapper[4826]: I0319 20:13:19.797415 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9rmq"]
Mar 19 20:13:19 crc kubenswrapper[4826]: I0319 20:13:19.969213 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bp86p"]
Mar 19 20:13:19 crc kubenswrapper[4826]: I0319 20:13:19.969481 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bp86p" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="registry-server" containerID="cri-o://700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47" gracePeriod=2
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.795876 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bp86p"
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.859290 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmwxj\" (UniqueName: \"kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj\") pod \"fdc7f0e3-fe71-4226-a701-40b602cf6444\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") "
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.859451 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities\") pod \"fdc7f0e3-fe71-4226-a701-40b602cf6444\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") "
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.859486 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content\") pod \"fdc7f0e3-fe71-4226-a701-40b602cf6444\" (UID: \"fdc7f0e3-fe71-4226-a701-40b602cf6444\") "
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.864846 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities" (OuterVolumeSpecName: "utilities") pod "fdc7f0e3-fe71-4226-a701-40b602cf6444" (UID: "fdc7f0e3-fe71-4226-a701-40b602cf6444"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.877900 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj" (OuterVolumeSpecName: "kube-api-access-kmwxj") pod "fdc7f0e3-fe71-4226-a701-40b602cf6444" (UID: "fdc7f0e3-fe71-4226-a701-40b602cf6444"). InnerVolumeSpecName "kube-api-access-kmwxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.963772 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 20:13:20 crc kubenswrapper[4826]: I0319 20:13:20.963808 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmwxj\" (UniqueName: \"kubernetes.io/projected/fdc7f0e3-fe71-4226-a701-40b602cf6444-kube-api-access-kmwxj\") on node \"crc\" DevicePath \"\""
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:20.999981 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdc7f0e3-fe71-4226-a701-40b602cf6444" (UID: "fdc7f0e3-fe71-4226-a701-40b602cf6444"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.028297 4826 generic.go:334] "Generic (PLEG): container finished" podID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerID="700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47" exitCode=0
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.028549 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bp86p"
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.028611 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerDied","Data":"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47"}
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.028649 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp86p" event={"ID":"fdc7f0e3-fe71-4226-a701-40b602cf6444","Type":"ContainerDied","Data":"00bac2129e9629f7f6ed0424368567c43c29cbb9988d4f3306f00bc9124fdc8b"}
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.028685 4826 scope.go:117] "RemoveContainer" containerID="700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47"
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.066110 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdc7f0e3-fe71-4226-a701-40b602cf6444-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.076092 4826 scope.go:117] "RemoveContainer" containerID="9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce"
Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.076962 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bp86p"]
Mar 19 20:13:21 crc kubenswrapper[4826]:
I0319 20:13:21.088333 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bp86p"] Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.106947 4826 scope.go:117] "RemoveContainer" containerID="724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.157695 4826 scope.go:117] "RemoveContainer" containerID="700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47" Mar 19 20:13:21 crc kubenswrapper[4826]: E0319 20:13:21.158201 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47\": container with ID starting with 700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47 not found: ID does not exist" containerID="700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.158248 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47"} err="failed to get container status \"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47\": rpc error: code = NotFound desc = could not find container \"700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47\": container with ID starting with 700f0b148af7c319786f5b3905922158394330437ce052b0c7378ee132ffba47 not found: ID does not exist" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.158279 4826 scope.go:117] "RemoveContainer" containerID="9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce" Mar 19 20:13:21 crc kubenswrapper[4826]: E0319 20:13:21.159093 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce\": container 
with ID starting with 9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce not found: ID does not exist" containerID="9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.159116 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce"} err="failed to get container status \"9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce\": rpc error: code = NotFound desc = could not find container \"9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce\": container with ID starting with 9482e4dda722a69decbcb0dee79dc8ff864ddede5698d6ec77ae7e70de51c2ce not found: ID does not exist" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.159132 4826 scope.go:117] "RemoveContainer" containerID="724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd" Mar 19 20:13:21 crc kubenswrapper[4826]: E0319 20:13:21.160434 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd\": container with ID starting with 724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd not found: ID does not exist" containerID="724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd" Mar 19 20:13:21 crc kubenswrapper[4826]: I0319 20:13:21.160463 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd"} err="failed to get container status \"724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd\": rpc error: code = NotFound desc = could not find container \"724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd\": container with ID starting with 724c6af6dc90b0175b4102f72a5587f2c5c4c14f2fd35d4bf18031456a6063bd not 
found: ID does not exist" Mar 19 20:13:22 crc kubenswrapper[4826]: I0319 20:13:22.002238 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" path="/var/lib/kubelet/pods/fdc7f0e3-fe71-4226-a701-40b602cf6444/volumes" Mar 19 20:13:25 crc kubenswrapper[4826]: I0319 20:13:25.016241 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrk78" Mar 19 20:13:25 crc kubenswrapper[4826]: I0319 20:13:25.071715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrk78" Mar 19 20:13:26 crc kubenswrapper[4826]: I0319 20:13:26.164400 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"] Mar 19 20:13:26 crc kubenswrapper[4826]: I0319 20:13:26.165264 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrk78" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" containerID="cri-o://1e23d27deaba0bfc72edb18ffe7f681743d08f73feb215f05cdd5e29a9995a5d" gracePeriod=2 Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.141957 4826 generic.go:334] "Generic (PLEG): container finished" podID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerID="1e23d27deaba0bfc72edb18ffe7f681743d08f73feb215f05cdd5e29a9995a5d" exitCode=0 Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.142082 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerDied","Data":"1e23d27deaba0bfc72edb18ffe7f681743d08f73feb215f05cdd5e29a9995a5d"} Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.706874 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk78" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.743811 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities\") pod \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.743990 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content\") pod \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.744047 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdmm9\" (UniqueName: \"kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9\") pod \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\" (UID: \"1c355bc9-260d-4bdb-ac08-a31641c1efb0\") " Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.745059 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities" (OuterVolumeSpecName: "utilities") pod "1c355bc9-260d-4bdb-ac08-a31641c1efb0" (UID: "1c355bc9-260d-4bdb-ac08-a31641c1efb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.761047 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9" (OuterVolumeSpecName: "kube-api-access-zdmm9") pod "1c355bc9-260d-4bdb-ac08-a31641c1efb0" (UID: "1c355bc9-260d-4bdb-ac08-a31641c1efb0"). InnerVolumeSpecName "kube-api-access-zdmm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.782814 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c355bc9-260d-4bdb-ac08-a31641c1efb0" (UID: "1c355bc9-260d-4bdb-ac08-a31641c1efb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.846787 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.846824 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdmm9\" (UniqueName: \"kubernetes.io/projected/1c355bc9-260d-4bdb-ac08-a31641c1efb0-kube-api-access-zdmm9\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:27 crc kubenswrapper[4826]: I0319 20:13:27.846833 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c355bc9-260d-4bdb-ac08-a31641c1efb0-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.156793 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrk78" event={"ID":"1c355bc9-260d-4bdb-ac08-a31641c1efb0","Type":"ContainerDied","Data":"345ebb1574b1080dab2e06685ad60d093e1f0acd3b169074d8319b04305780da"} Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.156846 4826 scope.go:117] "RemoveContainer" containerID="1e23d27deaba0bfc72edb18ffe7f681743d08f73feb215f05cdd5e29a9995a5d" Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.156884 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrk78" Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.183406 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"] Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.190057 4826 scope.go:117] "RemoveContainer" containerID="6e9b50033ff0540fd15c4aa6a4c10430ce7208c816f58ba71b78773c8abd4b58" Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.195606 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrk78"] Mar 19 20:13:28 crc kubenswrapper[4826]: I0319 20:13:28.219942 4826 scope.go:117] "RemoveContainer" containerID="066d919609a203fbf6ee5c1354f5bb1b4e2be3e282328de3b4df56a56ed46b69" Mar 19 20:13:30 crc kubenswrapper[4826]: I0319 20:13:30.023478 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" path="/var/lib/kubelet/pods/1c355bc9-260d-4bdb-ac08-a31641c1efb0/volumes" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.017908 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:13:32 crc kubenswrapper[4826]: E0319 20:13:32.018917 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.018938 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" Mar 19 20:13:32 crc kubenswrapper[4826]: E0319 20:13:32.018954 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="extract-content" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.018963 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="extract-content" Mar 19 20:13:32 crc 
kubenswrapper[4826]: E0319 20:13:32.018982 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="extract-content" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.018992 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="extract-content" Mar 19 20:13:32 crc kubenswrapper[4826]: E0319 20:13:32.019017 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="extract-utilities" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.019025 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="extract-utilities" Mar 19 20:13:32 crc kubenswrapper[4826]: E0319 20:13:32.019080 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="registry-server" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.019088 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="registry-server" Mar 19 20:13:32 crc kubenswrapper[4826]: E0319 20:13:32.019111 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="extract-utilities" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.019119 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="extract-utilities" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.019450 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc7f0e3-fe71-4226-a701-40b602cf6444" containerName="registry-server" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.019485 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c355bc9-260d-4bdb-ac08-a31641c1efb0" containerName="registry-server" Mar 19 20:13:32 
crc kubenswrapper[4826]: I0319 20:13:32.020593 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.022791 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.023073 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.023127 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.023188 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vxjvc" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.029053 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.063272 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.063603 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.063983 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064118 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064200 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064460 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064521 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064629 4826 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlt8n\" (UniqueName: \"kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.064706 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.166568 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167302 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167365 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167393 4826 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167506 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167750 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167815 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167918 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlt8n\" (UniqueName: \"kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.167968 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.168043 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.168602 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.169781 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.169838 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.173821 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") device mount path 
\"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.185078 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.185226 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.187519 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.188767 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlt8n\" (UniqueName: \"kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.224528 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.345747 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:13:32 crc kubenswrapper[4826]: I0319 20:13:32.937110 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 20:13:33 crc kubenswrapper[4826]: I0319 20:13:33.217776 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36ba563e-3246-4dbb-ba34-e15fd6646fad","Type":"ContainerStarted","Data":"4db1569639cd08c33f37c6f66fd4ae88f333f99fed18a87cd7c4b02404abb45c"} Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.174432 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565854-m7sgq"] Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.176302 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.187376 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.187545 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.187688 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.188727 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-m7sgq"] Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.272114 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkrkn\" (UniqueName: \"kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn\") pod \"auto-csr-approver-29565854-m7sgq\" (UID: \"89d4e4ee-f487-404d-9bdd-a945c36d7cbc\") " 
pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.375492 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkrkn\" (UniqueName: \"kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn\") pod \"auto-csr-approver-29565854-m7sgq\" (UID: \"89d4e4ee-f487-404d-9bdd-a945c36d7cbc\") " pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.396169 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkrkn\" (UniqueName: \"kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn\") pod \"auto-csr-approver-29565854-m7sgq\" (UID: \"89d4e4ee-f487-404d-9bdd-a945c36d7cbc\") " pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:00 crc kubenswrapper[4826]: I0319 20:14:00.503960 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:06 crc kubenswrapper[4826]: E0319 20:14:06.603645 4826 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 20:14:06 crc kubenswrapper[4826]: E0319 20:14:06.609709 4826 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlt8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(36ba563e-3246-4dbb-ba34-e15fd6646fad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 20:14:06 crc kubenswrapper[4826]: E0319 20:14:06.610824 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="36ba563e-3246-4dbb-ba34-e15fd6646fad" Mar 19 20:14:06 crc kubenswrapper[4826]: E0319 20:14:06.648124 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="36ba563e-3246-4dbb-ba34-e15fd6646fad" Mar 19 20:14:07 crc 
kubenswrapper[4826]: I0319 20:14:07.038969 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-m7sgq"] Mar 19 20:14:07 crc kubenswrapper[4826]: I0319 20:14:07.664053 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" event={"ID":"89d4e4ee-f487-404d-9bdd-a945c36d7cbc","Type":"ContainerStarted","Data":"b33aa2eea8839ebbe0d1bba247d0e585292f6e4aad012775260fe35cf31b41ee"} Mar 19 20:14:09 crc kubenswrapper[4826]: I0319 20:14:09.697227 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" event={"ID":"89d4e4ee-f487-404d-9bdd-a945c36d7cbc","Type":"ContainerStarted","Data":"945b8e02372f2bc787044d5007674cf89e3f5889d0014bbaf6ca51c6a1bcb018"} Mar 19 20:14:09 crc kubenswrapper[4826]: I0319 20:14:09.732062 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" podStartSLOduration=8.802298252 podStartE2EDuration="9.732030273s" podCreationTimestamp="2026-03-19 20:14:00 +0000 UTC" firstStartedPulling="2026-03-19 20:14:07.047390142 +0000 UTC m=+4671.801458465" lastFinishedPulling="2026-03-19 20:14:07.977122123 +0000 UTC m=+4672.731190486" observedRunningTime="2026-03-19 20:14:09.717754337 +0000 UTC m=+4674.471822680" watchObservedRunningTime="2026-03-19 20:14:09.732030273 +0000 UTC m=+4674.486098636" Mar 19 20:14:10 crc kubenswrapper[4826]: I0319 20:14:10.708968 4826 generic.go:334] "Generic (PLEG): container finished" podID="89d4e4ee-f487-404d-9bdd-a945c36d7cbc" containerID="945b8e02372f2bc787044d5007674cf89e3f5889d0014bbaf6ca51c6a1bcb018" exitCode=0 Mar 19 20:14:10 crc kubenswrapper[4826]: I0319 20:14:10.709017 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" 
event={"ID":"89d4e4ee-f487-404d-9bdd-a945c36d7cbc","Type":"ContainerDied","Data":"945b8e02372f2bc787044d5007674cf89e3f5889d0014bbaf6ca51c6a1bcb018"} Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.169549 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.294302 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkrkn\" (UniqueName: \"kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn\") pod \"89d4e4ee-f487-404d-9bdd-a945c36d7cbc\" (UID: \"89d4e4ee-f487-404d-9bdd-a945c36d7cbc\") " Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.303576 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn" (OuterVolumeSpecName: "kube-api-access-bkrkn") pod "89d4e4ee-f487-404d-9bdd-a945c36d7cbc" (UID: "89d4e4ee-f487-404d-9bdd-a945c36d7cbc"). InnerVolumeSpecName "kube-api-access-bkrkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.397269 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkrkn\" (UniqueName: \"kubernetes.io/projected/89d4e4ee-f487-404d-9bdd-a945c36d7cbc-kube-api-access-bkrkn\") on node \"crc\" DevicePath \"\"" Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.734923 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" event={"ID":"89d4e4ee-f487-404d-9bdd-a945c36d7cbc","Type":"ContainerDied","Data":"b33aa2eea8839ebbe0d1bba247d0e585292f6e4aad012775260fe35cf31b41ee"} Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.734963 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33aa2eea8839ebbe0d1bba247d0e585292f6e4aad012775260fe35cf31b41ee" Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.735027 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565854-m7sgq" Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.810074 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-9hxcr"] Mar 19 20:14:12 crc kubenswrapper[4826]: I0319 20:14:12.821346 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565848-9hxcr"] Mar 19 20:14:13 crc kubenswrapper[4826]: I0319 20:14:13.989878 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d364b9-495c-49a7-9c05-b6f0194baff2" path="/var/lib/kubelet/pods/93d364b9-495c-49a7-9c05-b6f0194baff2/volumes" Mar 19 20:14:20 crc kubenswrapper[4826]: I0319 20:14:20.451829 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 20:14:23 crc kubenswrapper[4826]: I0319 20:14:23.889515 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"36ba563e-3246-4dbb-ba34-e15fd6646fad","Type":"ContainerStarted","Data":"194aba72010cf27497ca6cea02ad74054529a252896975a16f9a305f2c121f86"} Mar 19 20:14:23 crc kubenswrapper[4826]: I0319 20:14:23.927518 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=6.426735783 podStartE2EDuration="53.92749564s" podCreationTimestamp="2026-03-19 20:13:30 +0000 UTC" firstStartedPulling="2026-03-19 20:13:32.947830551 +0000 UTC m=+4637.701898864" lastFinishedPulling="2026-03-19 20:14:20.448590398 +0000 UTC m=+4685.202658721" observedRunningTime="2026-03-19 20:14:23.91718544 +0000 UTC m=+4688.671253783" watchObservedRunningTime="2026-03-19 20:14:23.92749564 +0000 UTC m=+4688.681563963" Mar 19 20:14:40 crc kubenswrapper[4826]: I0319 20:14:40.006906 4826 scope.go:117] "RemoveContainer" containerID="2c3a27dcd08892facd14d19c4f81989418a74d404528ba37ee3545669c729224" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.197439 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g"] Mar 19 20:15:00 crc kubenswrapper[4826]: E0319 20:15:00.198578 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d4e4ee-f487-404d-9bdd-a945c36d7cbc" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.198600 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d4e4ee-f487-404d-9bdd-a945c36d7cbc" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.198852 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d4e4ee-f487-404d-9bdd-a945c36d7cbc" containerName="oc" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.199648 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.208092 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.208336 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.212025 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g"] Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.312244 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4bq\" (UniqueName: \"kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.312399 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.312542 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.415171 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4bq\" (UniqueName: \"kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.415299 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.415427 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.416291 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.424460 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.434774 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4bq\" (UniqueName: \"kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq\") pod \"collect-profiles-29565855-fss9g\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:00 crc kubenswrapper[4826]: I0319 20:15:00.529344 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:01 crc kubenswrapper[4826]: I0319 20:15:01.319693 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g"] Mar 19 20:15:02 crc kubenswrapper[4826]: I0319 20:15:02.425288 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" event={"ID":"28b3fd6c-b14a-4790-83e3-36ee9da9c840","Type":"ContainerStarted","Data":"ab277d98cce1f4e350cb6c4868374451b0521483515a75371ec903efd8d39ff7"} Mar 19 20:15:03 crc kubenswrapper[4826]: I0319 20:15:03.442197 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" event={"ID":"28b3fd6c-b14a-4790-83e3-36ee9da9c840","Type":"ContainerStarted","Data":"a7a2391f6898999b4fb6136b7fb397c794db7e94c2aff72298ef2c0da056cfb5"} Mar 19 20:15:03 crc kubenswrapper[4826]: I0319 20:15:03.466583 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" 
podStartSLOduration=3.466562402 podStartE2EDuration="3.466562402s" podCreationTimestamp="2026-03-19 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:15:03.456126149 +0000 UTC m=+4728.210194462" watchObservedRunningTime="2026-03-19 20:15:03.466562402 +0000 UTC m=+4728.220630715" Mar 19 20:15:04 crc kubenswrapper[4826]: I0319 20:15:04.457212 4826 generic.go:334] "Generic (PLEG): container finished" podID="28b3fd6c-b14a-4790-83e3-36ee9da9c840" containerID="a7a2391f6898999b4fb6136b7fb397c794db7e94c2aff72298ef2c0da056cfb5" exitCode=0 Mar 19 20:15:04 crc kubenswrapper[4826]: I0319 20:15:04.457328 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" event={"ID":"28b3fd6c-b14a-4790-83e3-36ee9da9c840","Type":"ContainerDied","Data":"a7a2391f6898999b4fb6136b7fb397c794db7e94c2aff72298ef2c0da056cfb5"} Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.046692 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.178752 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume\") pod \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.179388 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4bq\" (UniqueName: \"kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq\") pod \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.179680 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume\") pod \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\" (UID: \"28b3fd6c-b14a-4790-83e3-36ee9da9c840\") " Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.186575 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume" (OuterVolumeSpecName: "config-volume") pod "28b3fd6c-b14a-4790-83e3-36ee9da9c840" (UID: "28b3fd6c-b14a-4790-83e3-36ee9da9c840"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.201222 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28b3fd6c-b14a-4790-83e3-36ee9da9c840" (UID: "28b3fd6c-b14a-4790-83e3-36ee9da9c840"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.217780 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq" (OuterVolumeSpecName: "kube-api-access-7b4bq") pod "28b3fd6c-b14a-4790-83e3-36ee9da9c840" (UID: "28b3fd6c-b14a-4790-83e3-36ee9da9c840"). InnerVolumeSpecName "kube-api-access-7b4bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.283109 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4bq\" (UniqueName: \"kubernetes.io/projected/28b3fd6c-b14a-4790-83e3-36ee9da9c840-kube-api-access-7b4bq\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.283147 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28b3fd6c-b14a-4790-83e3-36ee9da9c840-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.283157 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28b3fd6c-b14a-4790-83e3-36ee9da9c840-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.480673 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" event={"ID":"28b3fd6c-b14a-4790-83e3-36ee9da9c840","Type":"ContainerDied","Data":"ab277d98cce1f4e350cb6c4868374451b0521483515a75371ec903efd8d39ff7"} Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.480711 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab277d98cce1f4e350cb6c4868374451b0521483515a75371ec903efd8d39ff7" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.480770 4826 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565855-fss9g" Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.556098 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"] Mar 19 20:15:06 crc kubenswrapper[4826]: I0319 20:15:06.566316 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565810-w7dq2"] Mar 19 20:15:07 crc kubenswrapper[4826]: I0319 20:15:07.995422 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5cc3244-041a-411c-9459-64b8b57ed1ae" path="/var/lib/kubelet/pods/f5cc3244-041a-411c-9459-64b8b57ed1ae/volumes" Mar 19 20:15:25 crc kubenswrapper[4826]: I0319 20:15:25.400875 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:15:25 crc kubenswrapper[4826]: I0319 20:15:25.401591 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:15:40 crc kubenswrapper[4826]: I0319 20:15:40.173521 4826 scope.go:117] "RemoveContainer" containerID="47fc440aa4c3e5bf057fcf82ecb76a2a0de666fa617ada59ab4bdfa77112bb5d" Mar 19 20:15:55 crc kubenswrapper[4826]: I0319 20:15:55.404333 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 20:15:55 crc kubenswrapper[4826]: I0319 20:15:55.407812 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.363782 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:15:57 crc kubenswrapper[4826]: E0319 20:15:57.370276 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b3fd6c-b14a-4790-83e3-36ee9da9c840" containerName="collect-profiles" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.370815 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b3fd6c-b14a-4790-83e3-36ee9da9c840" containerName="collect-profiles" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.375443 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b3fd6c-b14a-4790-83e3-36ee9da9c840" containerName="collect-profiles" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.384265 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.503500 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzmc\" (UniqueName: \"kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.503884 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.503964 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.607314 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.608417 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content\") pod \"redhat-operators-vt4z5\" (UID: 
\"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.609393 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzmc\" (UniqueName: \"kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.620306 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.621978 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.687232 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzmc\" (UniqueName: \"kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc\") pod \"redhat-operators-vt4z5\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.715476 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:15:57 crc kubenswrapper[4826]: I0319 20:15:57.768887 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:15:59 crc kubenswrapper[4826]: I0319 20:15:59.580131 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.137146 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerDied","Data":"da45a447d2129cee85036f1261d0bf2cdf4e5d868b2a5d5583cb4492b3ea8744"} Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.138395 4826 generic.go:334] "Generic (PLEG): container finished" podID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerID="da45a447d2129cee85036f1261d0bf2cdf4e5d868b2a5d5583cb4492b3ea8744" exitCode=0 Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.138778 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerStarted","Data":"a47ea601a88e3f920e02b6f94fc8d996719e98bd0696cedcf56e122064374f3e"} Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.156975 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.809322 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565856-nwv2c"] Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.814603 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.845237 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-nwv2c"] Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.894679 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnj7\" (UniqueName: \"kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7\") pod \"auto-csr-approver-29565856-nwv2c\" (UID: \"773a141c-3fd8-4f47-b8e1-ad98015ea7d4\") " pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.911631 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.911641 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.911634 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:16:00 crc kubenswrapper[4826]: I0319 20:16:00.997116 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnj7\" (UniqueName: \"kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7\") pod \"auto-csr-approver-29565856-nwv2c\" (UID: \"773a141c-3fd8-4f47-b8e1-ad98015ea7d4\") " pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.326458 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnj7\" (UniqueName: \"kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7\") pod \"auto-csr-approver-29565856-nwv2c\" (UID: \"773a141c-3fd8-4f47-b8e1-ad98015ea7d4\") " 
pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.329162 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.329225 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.327908 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded" start-of-body= Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.330765 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded" Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.354240 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.354694 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded" start-of-body= Mar 19 20:16:01 crc kubenswrapper[4826]: I0319 20:16:01.354731 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded" Mar 19 20:16:02 crc kubenswrapper[4826]: I0319 20:16:02.336893 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-nwv2c"] Mar 19 20:16:02 crc kubenswrapper[4826]: I0319 20:16:02.384708 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" event={"ID":"773a141c-3fd8-4f47-b8e1-ad98015ea7d4","Type":"ContainerStarted","Data":"75deefc8d688b23aecc15d14ccb268112b9361a9766883b60fe876272b9131cc"} Mar 19 20:16:02 crc kubenswrapper[4826]: I0319 20:16:02.768824 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:02 crc kubenswrapper[4826]: I0319 20:16:02.769253 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:03 crc kubenswrapper[4826]: I0319 20:16:03.409604 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" 
event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerStarted","Data":"3afda7e361b537c4d54b166e46699b3580326e89b48ef9f26da22c6b1123e603"} Mar 19 20:16:05 crc kubenswrapper[4826]: I0319 20:16:05.437354 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" event={"ID":"773a141c-3fd8-4f47-b8e1-ad98015ea7d4","Type":"ContainerStarted","Data":"a27243d89376028b26d66dfc28c4f4d0217cd4075ac5c42e4aa3f89b931bc73f"} Mar 19 20:16:05 crc kubenswrapper[4826]: I0319 20:16:05.467317 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" podStartSLOduration=4.453623265 podStartE2EDuration="5.465018335s" podCreationTimestamp="2026-03-19 20:16:00 +0000 UTC" firstStartedPulling="2026-03-19 20:16:02.365186976 +0000 UTC m=+4787.119255279" lastFinishedPulling="2026-03-19 20:16:03.376582036 +0000 UTC m=+4788.130650349" observedRunningTime="2026-03-19 20:16:05.461193812 +0000 UTC m=+4790.215262135" watchObservedRunningTime="2026-03-19 20:16:05.465018335 +0000 UTC m=+4790.219086658" Mar 19 20:16:06 crc kubenswrapper[4826]: I0319 20:16:06.575678 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:06 crc kubenswrapper[4826]: I0319 20:16:06.575734 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:06 crc kubenswrapper[4826]: I0319 20:16:06.578699 4826 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:06 crc kubenswrapper[4826]: I0319 20:16:06.578738 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:07 crc kubenswrapper[4826]: I0319 20:16:07.225823 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:07 crc kubenswrapper[4826]: I0319 20:16:07.267833 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:08 crc kubenswrapper[4826]: I0319 20:16:08.187847 4826 patch_prober.go:28] interesting pod/console-849c6d8fdf-t6vlp container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:08 crc 
kubenswrapper[4826]: I0319 20:16:08.188111 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-849c6d8fdf-t6vlp" podUID="d068f929-58c2-481e-99bd-e7808a74f36e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:09 crc kubenswrapper[4826]: I0319 20:16:09.668310 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" podUID="fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:10 crc kubenswrapper[4826]: I0319 20:16:10.511885 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:10 crc kubenswrapper[4826]: I0319 20:16:10.517208 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:10 crc kubenswrapper[4826]: I0319 20:16:10.724864 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:10 crc 
kubenswrapper[4826]: I0319 20:16:10.724940 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:10 crc kubenswrapper[4826]: I0319 20:16:10.893784 4826 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-zl2jh container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:10 crc kubenswrapper[4826]: I0319 20:16:10.893857 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" podUID="352eae31-d0e1-452b-8319-ab53b8095b5a" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.425351 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.425416 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.425358 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.425513 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.481431 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.481491 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.481440 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.481593 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.526835 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:11 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:11 crc kubenswrapper[4826]: > Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.526980 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:11 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:11 crc kubenswrapper[4826]: > Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.622407 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.662848 4826 prober.go:107] "Probe failed" 
probeType="Readiness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.662931 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.925821 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:11 crc kubenswrapper[4826]: I0319 20:16:11.926122 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.115330 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.115402 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.115464 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.115477 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.129337 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:12 crc kubenswrapper[4826]: timeout: health rpc did not complete within 1s Mar 19 20:16:12 crc kubenswrapper[4826]: > Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.129343 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:12 crc kubenswrapper[4826]: timeout: health rpc did not complete within 1s Mar 19 20:16:12 crc kubenswrapper[4826]: > Mar 19 20:16:12 crc 
kubenswrapper[4826]: I0319 20:16:12.346085 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.346130 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.346199 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.346282 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.354229 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.354286 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.354350 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.354376 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.380198 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.380253 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc 
kubenswrapper[4826]: I0319 20:16:12.393581 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.393621 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.644727 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.644785 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.661106 4826 patch_prober.go:28] interesting pod/metrics-server-657c5b447-gjh5h container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.661171 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.661172 4826 patch_prober.go:28] interesting pod/metrics-server-657c5b447-gjh5h container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.661237 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.769936 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.770114 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.995790 4826 patch_prober.go:28] interesting 
pod/monitoring-plugin-747c5d4c44-ltxl4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.995829 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:12 crc kubenswrapper[4826]: I0319 20:16:12.995842 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podUID="2b19eec2-98e8-47bd-b68f-55b033eb788c" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:13 crc kubenswrapper[4826]: I0319 20:16:13.275821 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:13 crc kubenswrapper[4826]: I0319 20:16:13.275886 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:13 crc kubenswrapper[4826]: I0319 20:16:13.333024 4826 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:13 crc kubenswrapper[4826]: I0319 20:16:13.422304 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:13 crc kubenswrapper[4826]: I0319 20:16:13.422370 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.551852 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.552226 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 
20:16:14.561441 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.561810 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.587107 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.587176 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.587395 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:14 crc kubenswrapper[4826]: I0319 20:16:14.587426 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" 
podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.090110 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.090125 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.583912 4826 patch_prober.go:28] interesting pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.583940 4826 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-d4pjw container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.71:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.583969 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" 
podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.583913 4826 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-qrlfg container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.584048 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podUID="e1f51b15-5d82-43d5-b391-5f4b10434957" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.583986 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" podUID="afb786fa-7916-4f36-9978-5bd829c9dbf8" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.71:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.771151 4826 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-l2p46 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.771211 4826 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podUID="0fc08676-ae6f-4018-8f85-259585de45fe" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.906991 4826 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qltmk container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:15 crc kubenswrapper[4826]: I0319 20:16:15.907072 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podUID="081e84d7-1c7e-4c6f-935e-ee01eaf393e2" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.141305 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.141654 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.240162 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.240654 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.240701 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.240740 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.485446 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.485736 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.485566 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.485841 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.489384 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.489454 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.489404 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.490288 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.607250 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.607686 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 
20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.717570 4826 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.717646 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="377fff75-1f59-4c28-a3ed-2bd89e803b73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.768098 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.768314 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.774908 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 19 20:16:16 crc kubenswrapper[4826]: I0319 20:16:16.970952 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podUID="5f60643c-c919-436b-bd23-9e39698d9c9b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.033858 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.091821 4826 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.091891 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="05505420-3d58-4de7-9da6-2f27e54c32f5" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.116893 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.157944 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.199057 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.199053 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.239957 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podUID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.280988 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.388870 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.388944 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podUID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.483957 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.544881 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.580824 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded" start-of-body= Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.580876 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.580888 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": context deadline exceeded" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.580909 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.638811 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" podUID="79a89fcd-3226-4314-951d-d94af2ac242c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.718811 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.718882 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.719160 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.718942 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.719250 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:17 crc kubenswrapper[4826]: I0319 20:16:17.828904 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" podUID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.078808 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.078890 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.119933 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.119955 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.186996 4826 patch_prober.go:28] interesting pod/console-849c6d8fdf-t6vlp container/console namespace/openshift-console: Readiness probe status=failure output="Get 
\"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:18 crc kubenswrapper[4826]: I0319 20:16:18.187057 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-849c6d8fdf-t6vlp" podUID="d068f929-58c2-481e-99bd-e7808a74f36e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:19 crc kubenswrapper[4826]: I0319 20:16:19.235828 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:19 crc kubenswrapper[4826]: I0319 20:16:19.236224 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:19 crc kubenswrapper[4826]: I0319 20:16:19.667847 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" podUID="fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:19 crc kubenswrapper[4826]: I0319 20:16:19.821930 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" 
podUID="50980b03-91b0-4e4d-9923-e2a531458fd4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:19 crc kubenswrapper[4826]: I0319 20:16:19.822224 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" podUID="50980b03-91b0-4e4d-9923-e2a531458fd4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.088809 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.089063 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.416952 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" podUID="010ce31f-d333-43a9-b1e0-cd85cc0f6fd6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.481857 4826 patch_prober.go:28] interesting 
pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.481911 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.481858 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.481964 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.662986 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc 
kubenswrapper[4826]: I0319 20:16:20.663447 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.662933 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.663570 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.666457 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.666646 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.674069 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"76591b9cfad14c68b1ef112f6ed7cca58927da7f28bfa6fafae17389b99d7728"} 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.676157 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" containerID="cri-o://76591b9cfad14c68b1ef112f6ed7cca58927da7f28bfa6fafae17389b99d7728" gracePeriod=30 Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.746894 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.770472 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zqv4g" podUID="9f450458-8845-4ec5-9971-6df9dd448312" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.828865 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.828932 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.829141 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.829416 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.829923 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.851755 4826 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-zl2jh container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:20 crc kubenswrapper[4826]: I0319 20:16:20.851813 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" podUID="352eae31-d0e1-452b-8319-ab53b8095b5a" containerName="authentication-operator" 
probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.142626 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.142733 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.142735 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.142792 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.161219 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.161286 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.161225 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.161453 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.457862 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.457925 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc 
kubenswrapper[4826]: I0319 20:16:21.457992 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.458008 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.458157 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.458213 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.458248 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.458298 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.481615 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.481672 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.481760 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.481799 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.657904 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.739889 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.739905 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.739981 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.739929 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.740012 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.771127 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.771157 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.985965 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.986042 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:21 crc kubenswrapper[4826]: I0319 20:16:21.986102 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.026850 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.026909 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.068881 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.068983 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.152388 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get 
\"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.152458 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.152691 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.152755 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.238865 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.239317 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" 
podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.322920 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.322985 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.405951 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.405991 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-cnfr9" podUID="13651756-55fe-46f1-b849-fbdc5dc20887" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406021 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get 
\"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406088 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406107 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406149 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406210 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cnfr9" podUID="13651756-55fe-46f1-b849-fbdc5dc20887" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406202 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" 
podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406244 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406357 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406389 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.406423 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc 
kubenswrapper[4826]: I0319 20:16:22.644455 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.644578 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.661291 4826 patch_prober.go:28] interesting pod/metrics-server-657c5b447-gjh5h container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.661370 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.767758 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.769133 4826 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.771308 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.773554 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.774165 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 20:16:22 crc kubenswrapper[4826]: I0319 20:16:22.786855 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"48ef0c18927acad3cab5327c9df4d256a3f01325b10cf9e6772558514a35dec9"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.036851 4826 patch_prober.go:28] interesting pod/monitoring-plugin-747c5d4c44-ltxl4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.037124 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podUID="2b19eec2-98e8-47bd-b68f-55b033eb788c" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.036937 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.036859 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.139079 4826 trace.go:236] Trace[534867564]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (19-Mar-2026 20:16:15.277) (total time: 7857ms): Mar 19 20:16:23 crc kubenswrapper[4826]: Trace[534867564]: [7.857900577s] [7.857900577s] END Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.275935 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.275961 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.375826 4826 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.375823 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.422286 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.422338 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.620866 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.620925 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.768941 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.770078 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.770140 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.770642 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:23 crc kubenswrapper[4826]: I0319 20:16:23.772870 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.236416 4826 patch_prober.go:28] interesting 
pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.236745 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.741058 4826 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h7mf4 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.741124 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" podUID="86f5311b-39ed-455f-a9bc-b83044d63db8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.750146 4826 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-tw9k9 container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.750216 4826 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" podUID="4e673de9-6eb1-430b-8123-1254957f125f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.773554 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:24 crc kubenswrapper[4826]: I0319 20:16:24.773758 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.089643 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.089796 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.089631 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.095832 4826 patch_prober.go:28] interesting 
pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.095875 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.095930 4826 patch_prober.go:28] interesting pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.095947 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.108359 4826 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-77cc2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.108421 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" podUID="85a0e24a-07c2-4184-8f80-90479e82f839" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.120256 4826 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-77cc2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.120324 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" podUID="85a0e24a-07c2-4184-8f80-90479e82f839" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.251169 4826 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-d4pjw container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.251516 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" podUID="afb786fa-7916-4f36-9978-5bd829c9dbf8" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.427975 4826 patch_prober.go:28] interesting 
pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.428084 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.428143 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.430718 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.432148 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482" gracePeriod=600 Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.463202 4826 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-qrlfg container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.463298 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podUID="e1f51b15-5d82-43d5-b391-5f4b10434957" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.580477 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.580558 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.769151 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zqv4g" podUID="9f450458-8845-4ec5-9971-6df9dd448312" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.771370 4826 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-l2p46 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.771451 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podUID="0fc08676-ae6f-4018-8f85-259585de45fe" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.908367 4826 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qltmk container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:25 crc kubenswrapper[4826]: I0319 20:16:25.908418 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podUID="081e84d7-1c7e-4c6f-935e-ee01eaf393e2" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.142035 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.142093 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" 
output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.143314 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.143367 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.195888 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.195928 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.195946 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" 
probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.195961 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.195881 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.485977 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.486072 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.486419 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.486346 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.489398 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.489478 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.492282 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.492349 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.492404 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.493950 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb"} pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.494012 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" containerID="cri-o://124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb" gracePeriod=30 Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.608706 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.608764 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.717343 4826 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.717412 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="377fff75-1f59-4c28-a3ed-2bd89e803b73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.771214 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.771314 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.771417 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.937844 4826 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482"} Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.939448 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482" exitCode=0 Mar 19 20:16:26 crc kubenswrapper[4826]: I0319 20:16:26.941951 4826 scope.go:117] "RemoveContainer" containerID="02b3c6e47f7a47f581a85215aee0d596eac58c76c3ea4fd0865e77774a333c9d" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.010019 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podUID="5f60643c-c919-436b-bd23-9e39698d9c9b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.010381 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podUID="5f60643c-c919-436b-bd23-9e39698d9c9b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.052168 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.257922 4826 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.340851 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.423876 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.423953 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.424002 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.519859 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.519951 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podUID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.520591 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.598837 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.599022 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.682930 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podUID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.764850 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.764857 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.770630 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-plvms" podUID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.778451 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 
20:16:27.778537 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.779247 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-plvms" podUID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.780205 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"6bce5b1cd3e4e908191a9ef12dfd6f5c8e6ba3c2ec093ded1e9938b5a4c85dc8"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.780282 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerName="ceilometer-central-agent" containerID="cri-o://6bce5b1cd3e4e908191a9ef12dfd6f5c8e6ba3c2ec093ded1e9938b5a4c85dc8" gracePeriod=30 Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.846874 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podUID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.846959 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.928998 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.929540 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.929715 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.929841 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podUID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.970963 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.971024 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.971080 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.971115 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.971129 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:27 crc kubenswrapper[4826]: I0319 20:16:27.971161 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.052884 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" podUID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.052996 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.053014 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="06606200-618f-46b2-afb9-e5e2738fe2dd" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.053031 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="06606200-618f-46b2-afb9-e5e2738fe2dd" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.052861 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.090358 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.134814 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.134848 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.300854 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.301042 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" 
podUID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383008 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383258 4826 patch_prober.go:28] interesting pod/console-849c6d8fdf-t6vlp container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383397 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-849c6d8fdf-t6vlp" podUID="d068f929-58c2-481e-99bd-e7808a74f36e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383193 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383296 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383335 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383477 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-849c6d8fdf-t6vlp" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383160 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.383335 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.580331 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.580768 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.773125 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-xnzmj" podUID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.774094 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-xnzmj" podUID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.963703 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823"} Mar 19 20:16:28 crc kubenswrapper[4826]: I0319 20:16:28.971862 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.009754 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.009862 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.235989 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.236061 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.384496 4826 patch_prober.go:28] interesting pod/console-849c6d8fdf-t6vlp container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.384569 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-849c6d8fdf-t6vlp" podUID="d068f929-58c2-481e-99bd-e7808a74f36e" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.669846 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" podUID="fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.751079 4826 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-tw9k9 container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.751432 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" podUID="4e673de9-6eb1-430b-8123-1254957f125f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:29 crc kubenswrapper[4826]: I0319 20:16:29.780820 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" podUID="50980b03-91b0-4e4d-9923-e2a531458fd4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.088332 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" 
probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: E0319 20:16:30.193304 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:16:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:16:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:16:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T20:16:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.417871 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" podUID="010ce31f-d333-43a9-b1e0-cd85cc0f6fd6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.474916 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="hostpath-provisioner/csi-hostpathplugin-4rf57" podUID="eeb43c2f-961b-4ed4-9aa0-cda4dea289cb" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.35:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.482446 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.482497 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.482770 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.482826 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc 
kubenswrapper[4826]: I0319 20:16:30.482879 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.485509 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"9bc32b51b5566b427ffa287240d9eb0613e8145bd9253dd2736092863a4a7221"} pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" containerMessage="Container oauth-openshift failed liveness probe, will be restarted" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.625972 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-mw64r" podUID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:30 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:30 crc kubenswrapper[4826]: > Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.768878 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-mw64r" podUID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.769257 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zqv4g" podUID="9f450458-8845-4ec5-9971-6df9dd448312" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.769345 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.770831 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.772230 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.807958 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.807994 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808102 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808140 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808166 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808187 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808207 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.808336 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.820193 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"5cfbb83518ed54ebf9988facce22068ceae695e9cf4f38cc1f2065a778e798be"} pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.821607 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" 
containerID="cri-o://5cfbb83518ed54ebf9988facce22068ceae695e9cf4f38cc1f2065a778e798be" gracePeriod=2 Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.851765 4826 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-zl2jh container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.851828 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" podUID="352eae31-d0e1-452b-8319-ab53b8095b5a" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.851872 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.853084 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584"} pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 19 20:16:30 crc kubenswrapper[4826]: I0319 20:16:30.853124 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" podUID="352eae31-d0e1-452b-8319-ab53b8095b5a" containerName="authentication-operator" 
containerID="cri-o://d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584" gracePeriod=30 Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.091359 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.141853 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.141892 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.141912 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.141944 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.161362 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.161421 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.161506 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.161524 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.459852 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.459911 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.459957 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460025 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460092 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460113 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460144 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460174 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460241 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.460492 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.461871 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"e82fdfcddbdef8d498bd3bdde22d4b6daa10cd720da0d57c6f23bff2745a3227"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.461910 4826 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" containerID="cri-o://e82fdfcddbdef8d498bd3bdde22d4b6daa10cd720da0d57c6f23bff2745a3227" gracePeriod=30 Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481423 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481495 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481561 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481822 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481882 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.481967 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.482921 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"a456ee0ad60c9f375ff1bffa7a4c02e145d0984db73abfcd1d8cb0a4007c2682"} pod="openshift-console-operator/console-operator-58897d9998-zc8ht" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.482970 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" containerID="cri-o://a456ee0ad60c9f375ff1bffa7a4c02e145d0984db73abfcd1d8cb0a4007c2682" gracePeriod=30 Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.619957 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.620066 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.621883 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"d6501efe4d21439968204859b1ed1ca17f8790cc86545382812dea2434ac9b1b"} 
pod="metallb-system/frr-k8s-prxxj" containerMessage="Container frr failed liveness probe, will be restarted" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.622050 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="frr" containerID="cri-o://d6501efe4d21439968204859b1ed1ca17f8790cc86545382812dea2434ac9b1b" gracePeriod=2 Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.661807 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.661883 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.661825 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.661971 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.662946 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.662979 4826 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.850108 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" podUID="b57da585-9fca-48a5-a872-e5019db1e36e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.857951 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:31 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:31 crc kubenswrapper[4826]: > Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.857955 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:31 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:31 crc kubenswrapper[4826]: > Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.858065 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.858087 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:16:31 crc kubenswrapper[4826]: 
I0319 20:16:31.860374 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a"} pod="openshift-marketplace/community-operators-p9rmq" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.860428 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" containerID="cri-o://9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a" gracePeriod=30 Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.924856 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.924926 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:31 crc kubenswrapper[4826]: I0319 20:16:31.925010 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.239013 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get 
\"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.239074 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.321858 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.321920 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.321974 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.321996 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.322017 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-drbf6" 
podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.322140 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.323548 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"e5ad2e2cb4a689f1fcd62421183d45d8058fdf428b04c4a6dd5bcdf89ea60bd9"} pod="openshift-ingress/router-default-5444994796-drbf6" containerMessage="Container router failed liveness probe, will be restarted" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.323586 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" containerID="cri-o://e5ad2e2cb4a689f1fcd62421183d45d8058fdf428b04c4a6dd5bcdf89ea60bd9" gracePeriod=10 Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.407742 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.407813 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.407898 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.407930 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.407987 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408056 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408082 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408019 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cnfr9" podUID="13651756-55fe-46f1-b849-fbdc5dc20887" containerName="controller" probeResult="failure" output="Get 
\"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408126 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408221 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408331 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408332 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408409 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.408475 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/controller-7bb4cc7c98-cnfr9" podUID="13651756-55fe-46f1-b849-fbdc5dc20887" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.420503 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"d3a209fa43a7430fe81c94d1187f4b67a6e187072f60a4066fd2e2e620507871"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" containerMessage="Container olm-operator failed liveness probe, will be restarted" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.420581 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" containerID="cri-o://d3a209fa43a7430fe81c94d1187f4b67a6e187072f60a4066fd2e2e620507871" gracePeriod=30 Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.420694 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"9f3ea13753127e75061a238605c9b00b1148fb1f5ff5ec51558df743b2cf27d5"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.420780 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" containerID="cri-o://9f3ea13753127e75061a238605c9b00b1148fb1f5ff5ec51558df743b2cf27d5" gracePeriod=30 Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.532197 4826 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.532278 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.643875 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.643933 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.643983 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.651492 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-scheduler" 
containerStatusID={"Type":"cri-o","ID":"20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429"} pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" containerMessage="Container kube-scheduler failed liveness probe, will be restarted" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.651578 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" containerID="cri-o://20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429" gracePeriod=30 Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.661869 4826 patch_prober.go:28] interesting pod/metrics-server-657c5b447-gjh5h container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.661864 4826 patch_prober.go:28] interesting pod/metrics-server-657c5b447-gjh5h container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.661925 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.661982 4826 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.662039 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.664913 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"1d31d0d8db0c2bb8340ffd0cc50bd990421104e190bb05e42ac92b09f1760326"} pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.664972 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" podUID="ab7c3046-ac34-417e-a7c6-63e500286063" containerName="metrics-server" containerID="cri-o://1d31d0d8db0c2bb8340ffd0cc50bd990421104e190bb05e42ac92b09f1760326" gracePeriod=170 Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.717419 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zqv4g" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.772907 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-engine-84bf69fdcb-b6hq4" podUID="a8d87bc1-29fa-4219-8c55-968d58f697e8" containerName="heat-engine" probeResult="failure" output="command timed out" Mar 19 20:16:32 crc kubenswrapper[4826]: I0319 20:16:32.773679 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-engine-84bf69fdcb-b6hq4" podUID="a8d87bc1-29fa-4219-8c55-968d58f697e8" 
containerName="heat-engine" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.007959 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008466 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008482 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008503 4826 patch_prober.go:28] interesting pod/monitoring-plugin-747c5d4c44-ltxl4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008529 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008537 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podUID="2b19eec2-98e8-47bd-b68f-55b033eb788c" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.008587 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.275982 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.276129 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w2f68" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.276019 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.276489 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/speaker-w2f68" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.283101 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="speaker" containerStatusID={"Type":"cri-o","ID":"cecf9f716377394271759e79568a9bf3e3c6352f09e209ea04029a11db3f6fc7"} pod="metallb-system/speaker-w2f68" containerMessage="Container speaker failed liveness probe, will be restarted" Mar 19 20:16:33 crc 
kubenswrapper[4826]: I0319 20:16:33.283186 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" containerID="cri-o://cecf9f716377394271759e79568a9bf3e3c6352f09e209ea04029a11db3f6fc7" gracePeriod=2 Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.375906 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.375977 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.376060 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.376138 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.408899 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.408975 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.409111 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.409170 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.423498 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.423552 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.423645 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.500478 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" podUID="eeb43c2f-961b-4ed4-9aa0-cda4dea289cb" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.35:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.769503 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.769512 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.769827 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.770113 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.770142 4826 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.770234 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.770300 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.771729 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"3c6b4dafb4bb937c4481ee36080942d492dddee83e2f324b34dcb098d03b3ea9"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.772135 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f"} pod="openstack-operators/openstack-operator-index-wrwzn" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.772190 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" containerID="cri-o://9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f" gracePeriod=30 Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.774455 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" 
containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:33 crc kubenswrapper[4826]: I0319 20:16:33.774516 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.012966 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.013036 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.048914 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.048992 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.059159 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" event={"ID":"b57da585-9fca-48a5-a872-e5019db1e36e","Type":"ContainerDied","Data":"5cfbb83518ed54ebf9988facce22068ceae695e9cf4f38cc1f2065a778e798be"} Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.060483 4826 generic.go:334] "Generic (PLEG): container finished" podID="b57da585-9fca-48a5-a872-e5019db1e36e" containerID="5cfbb83518ed54ebf9988facce22068ceae695e9cf4f38cc1f2065a778e798be" exitCode=137 Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.067242 4826 generic.go:334] "Generic (PLEG): container finished" podID="b724e39c-45b5-4701-b4f0-a19969224d90" containerID="d6501efe4d21439968204859b1ed1ca17f8790cc86545382812dea2434ac9b1b" exitCode=143 Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.067290 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerDied","Data":"d6501efe4d21439968204859b1ed1ca17f8790cc86545382812dea2434ac9b1b"} Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.132875 4826 patch_prober.go:28] interesting pod/monitoring-plugin-747c5d4c44-ltxl4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.132920 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.132943 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podUID="2b19eec2-98e8-47bd-b68f-55b033eb788c" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.133228 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.133257 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.235966 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.236347 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.317917 4826 prober.go:107] 
"Probe failed" probeType="Readiness" pod="metallb-system/speaker-w2f68" podUID="b812f1db-b2c8-467c-977a-a8661540546e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.421888 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.580061 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.580121 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.703304 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" podUID="eeb43c2f-961b-4ed4-9aa0-cda4dea289cb" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.750359 4826 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-tw9k9 container/oauth-apiserver 
namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.750470 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" podUID="4e673de9-6eb1-430b-8123-1254957f125f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.769345 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.770822 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:34 crc kubenswrapper[4826]: I0319 20:16:34.957870 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="e18e7f7e-f1f1-4349-a076-79e1f781315d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.224:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:34 crc kubenswrapper[4826]: E0319 20:16:34.993711 4826 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.056864 4826 patch_prober.go:28] interesting pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager 
namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.057232 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.057326 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.088702 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.088820 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": context deadline exceeded" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.107721 4826 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-77cc2 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 
20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.107801 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" podUID="85a0e24a-07c2-4184-8f80-90479e82f839" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.111034 4826 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-77cc2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.116959 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-77cc2" podUID="85a0e24a-07c2-4184-8f80-90479e82f839" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.72:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.250755 4826 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-d4pjw container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.250844 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" podUID="afb786fa-7916-4f36-9978-5bd829c9dbf8" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.250938 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.462965 4826 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-qrlfg container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.463012 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podUID="e1f51b15-5d82-43d5-b391-5f4b10434957" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.463074 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.489727 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.489784 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.771080 4826 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-l2p46 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.771364 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podUID="0fc08676-ae6f-4018-8f85-259585de45fe" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.771442 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.906964 4826 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qltmk container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.907046 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podUID="081e84d7-1c7e-4c6f-935e-ee01eaf393e2" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:35 crc kubenswrapper[4826]: I0319 20:16:35.907121 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.094232 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"256d42755872dc9290eef729e8b1935356cb74f5669ce6a5b0a7ad7abeffe11d"} Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.095622 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"04a2f678f0acbe9cdead3d329e1d3aa6d7f40e03a8448d5eb57c7065bb9d062e"} pod="metallb-system/frr-k8s-prxxj" containerMessage="Container controller failed liveness probe, will be restarted" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.095850 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" containerID="cri-o://04a2f678f0acbe9cdead3d329e1d3aa6d7f40e03a8448d5eb57c7065bb9d062e" gracePeriod=2 Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.098878 4826 generic.go:334] "Generic (PLEG): container finished" podID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerID="e82fdfcddbdef8d498bd3bdde22d4b6daa10cd720da0d57c6f23bff2745a3227" exitCode=0 Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.098951 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" event={"ID":"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f","Type":"ContainerDied","Data":"e82fdfcddbdef8d498bd3bdde22d4b6daa10cd720da0d57c6f23bff2745a3227"} Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.099015 4826 patch_prober.go:28] interesting pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.099047 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.140067 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.140136 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.141840 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.141886 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get 
\"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.142066 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.142103 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.145820 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-zc8ht_f61cc107-39c3-4add-b9a1-45c5d744ea4b/console-operator/0.log" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.145893 4826 generic.go:334] "Generic (PLEG): container finished" podID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerID="a456ee0ad60c9f375ff1bffa7a4c02e145d0984db73abfcd1d8cb0a4007c2682" exitCode=1 Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.145934 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" event={"ID":"f61cc107-39c3-4add-b9a1-45c5d744ea4b","Type":"ContainerDied","Data":"a456ee0ad60c9f375ff1bffa7a4c02e145d0984db73abfcd1d8cb0a4007c2682"} Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.149257 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" 
event={"ID":"b57da585-9fca-48a5-a872-e5019db1e36e","Type":"ContainerStarted","Data":"d0cc397df977944405345153a37aedf8495905d32297586ea0028f2c888bde47"} Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.149501 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.236857 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.236881 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.236937 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.237009 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.237025 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe 
status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.237056 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.237089 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.251995 4826 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-d4pjw container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.252046 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" podUID="afb786fa-7916-4f36-9978-5bd829c9dbf8" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.71:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.463333 4826 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-qrlfg 
container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.463361 4826 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-qrlfg container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.485076 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.485079 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.517207 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 
20:16:36.517262 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podUID="e1f51b15-5d82-43d5-b391-5f4b10434957" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.517207 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.517514 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.517207 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" podUID="e1f51b15-5d82-43d5-b391-5f4b10434957" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.552825 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"79bd64d7d2c89268df3c32958214a5b3ac6e94666db0ac84e1fea7d21c755d03"} pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" containerMessage="Container controller-manager failed liveness probe, will be restarted" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 
20:16:36.553178 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" containerID="cri-o://79bd64d7d2c89268df3c32958214a5b3ac6e94666db0ac84e1fea7d21c755d03" gracePeriod=30 Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.607623 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.607702 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.607806 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.726080 4826 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.726180 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="377fff75-1f59-4c28-a3ed-2bd89e803b73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.726297 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.770435 4826 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-l2p46 container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.770494 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podUID="0fc08676-ae6f-4018-8f85-259585de45fe" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.772732 4826 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-l2p46 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.772767 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" podUID="0fc08676-ae6f-4018-8f85-259585de45fe" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.773868 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.906494 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": context deadline exceeded" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.906565 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": context deadline exceeded" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.906962 4826 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qltmk container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.908383 4826 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qltmk container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.908419 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podUID="081e84d7-1c7e-4c6f-935e-ee01eaf393e2" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.920171 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" podUID="081e84d7-1c7e-4c6f-935e-ee01eaf393e2" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.967861 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podUID="5f60643c-c919-436b-bd23-9e39698d9c9b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:36 crc kubenswrapper[4826]: I0319 20:16:36.968001 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.099966 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.099983 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" podUID="38267b94-39ea-4067-9b6e-3d863ff60494" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.100068 4826 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.100441 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.100484 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.100439 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="05505420-3d58-4de7-9da6-2f27e54c32f5" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.141905 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.142038 4826 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.162582 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" event={"ID":"9bc83b3f-72da-4527-b7a8-5f09d3f5f39f","Type":"ContainerStarted","Data":"12cb99b82a8de0bb19e30fc326b582acd2f011f9b4de6aa8fc4040252b7ca9b5"} Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.162821 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.164556 4826 generic.go:334] "Generic (PLEG): container finished" podID="b812f1db-b2c8-467c-977a-a8661540546e" containerID="cecf9f716377394271759e79568a9bf3e3c6352f09e209ea04029a11db3f6fc7" exitCode=137 Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.164618 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w2f68" event={"ID":"b812f1db-b2c8-467c-977a-a8661540546e","Type":"ContainerDied","Data":"cecf9f716377394271759e79568a9bf3e3c6352f09e209ea04029a11db3f6fc7"} Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.183937 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.183978 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.184171 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.184220 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.224896 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.225623 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.225703 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.266986 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.267113 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.314146 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.314238 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.314357 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.314383 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/live\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.360524 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt" podUID="6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.360607 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podUID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.360681 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.361041 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.361139 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.361195 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.400816 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.400845 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" podUID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.400896 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.482870 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.483030 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.608126 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure 
output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.608208 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.608473 4826 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": context deadline exceeded" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.608529 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="93990ea7-96ba-4c12-b92c-17a7c38aece4" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": context deadline exceeded" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.623943 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.624348 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.624079 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/heat-cfnapi-5b5447b648-5hq9h" podUID="45e3dc79-4f5e-4bec-a579-41b93f1d6150" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.29:8000/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.706866 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-fb594b459-7sf97" podUID="77ff195c-0819-4764-b09f-fd10a1aea177" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.30:8004/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.707040 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-5b5447b648-5hq9h" podUID="45e3dc79-4f5e-4bec-a579-41b93f1d6150" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.29:8000/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.717825 4826 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.717905 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="377fff75-1f59-4c28-a3ed-2bd89e803b73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.727453 4826 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor 
namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.727491 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="377fff75-1f59-4c28-a3ed-2bd89e803b73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.747825 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" podUID="79a89fcd-3226-4314-951d-d94af2ac242c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.747992 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-fb594b459-7sf97" podUID="77ff195c-0819-4764-b09f-fd10a1aea177" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.30:8004/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.771708 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.772476 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-plvms" podUID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:37 
crc kubenswrapper[4826]: I0319 20:16:37.773537 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-plvms" podUID="623d1506-574a-4eee-9f1c-cc0ee85e9083" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.773582 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.788791 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.789084 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.789163 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.789264 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.789314 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.788897 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.788833 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.789483 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.792383 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.792444 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.793214 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"d633e3e09132bed99d00b1fbe22f27863f03fa82f637c17e9bce1e725b46e4df"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.793272 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" containerID="cri-o://d633e3e09132bed99d00b1fbe22f27863f03fa82f637c17e9bce1e725b46e4df" gracePeriod=30 Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.830921 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" podUID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.831020 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.894379 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" 
podUID="06606200-618f-46b2-afb9-e5e2738fe2dd" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:37 crc kubenswrapper[4826]: I0319 20:16:37.894403 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="06606200-618f-46b2-afb9-e5e2738fe2dd" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.089866 4826 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.089927 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="05505420-3d58-4de7-9da6-2f27e54c32f5" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.122875 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.122884 4826 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.122978 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.123014 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.163844 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.163844 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs" podUID="5f60643c-c919-436b-bd23-9e39698d9c9b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.163948 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.163833 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" 
podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.164102 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.208260 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" podUID="080fa697-4720-424e-b75e-6564061cd68f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.248884 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" podUID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.330829 4826 patch_prober.go:28] interesting pod/console-849c6d8fdf-t6vlp container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.330884 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-849c6d8fdf-t6vlp" podUID="d068f929-58c2-481e-99bd-e7808a74f36e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.330835 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.331080 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.331097 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.331140 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" podUID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.444870 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" podUID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.444888 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x" podUID="271f8c86-929d-46a4-8852-f5ec8e701bcb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.524952 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" podUID="6243b523-966a-4f1d-b663-2f1ed4614fdb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.665935 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.792671 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.792759 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" 
podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: E0319 20:16:38.867190 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:38 crc kubenswrapper[4826]: E0319 20:16:38.870094 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:38 crc kubenswrapper[4826]: E0319 20:16:38.874263 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:38 crc kubenswrapper[4826]: E0319 20:16:38.874349 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.874909 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" podUID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:38 crc kubenswrapper[4826]: I0319 20:16:38.874941 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" podUID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.010301 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.010382 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: E0319 20:16:39.094296 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:39 crc kubenswrapper[4826]: E0319 20:16:39.096742 4826 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:39 crc kubenswrapper[4826]: E0319 20:16:39.099407 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 20:16:39 crc kubenswrapper[4826]: E0319 20:16:39.099479 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.236745 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.237036 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.237546 4826 patch_prober.go:28] interesting pod/thanos-querier-788cb6bfb6-558hf 
container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": context deadline exceeded" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.237612 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-788cb6bfb6-558hf" podUID="c1f0645f-c1e5-40ed-8cc7-d3b7e15175b8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": context deadline exceeded" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.258242 4826 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.258296 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.295780 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" podUID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.295866 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6" 
podUID="dc64459f-49c1-41f5-b946-88ab7bc8e1d8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.295895 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp" podUID="aff2d31f-3465-4c0c-8bbf-b04dfdb92db0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.296173 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2" podUID="5d8869b3-7d43-4db2-b79d-f05c13d0d6f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.605854 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-4rf57" podUID="eeb43c2f-961b-4ed4-9aa0-cda4dea289cb" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.668874 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-fhw8v" podUID="fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.740689 4826 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h7mf4 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe 
status=failure output="Get \"https://10.217.0.5:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.740744 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" podUID="86f5311b-39ed-455f-a9bc-b83044d63db8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.740765 4826 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h7mf4 container/openshift-apiserver namespace/openshift-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.740829 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-apiserver/apiserver-76f77b778f-h7mf4" podUID="86f5311b-39ed-455f-a9bc-b83044d63db8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.753098 4826 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-tw9k9 container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.753169 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tw9k9" 
podUID="4e673de9-6eb1-430b-8123-1254957f125f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.773287 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-xnzmj" podUID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:39 crc kubenswrapper[4826]: I0319 20:16:39.773471 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-xnzmj" podUID="326a5687-dfe7-4a01-b8b9-c6bedd76684a" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.088582 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.088601 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": context deadline exceeded" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.088984 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.093148 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" 
containerStatusID={"Type":"cri-o","ID":"a0a6be61771c3bd1837bd426d4b58d5149185e19bd788e28443b62cb3821baf3"} pod="openstack/prometheus-metric-storage-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.093582 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" containerID="cri-o://a0a6be61771c3bd1837bd426d4b58d5149185e19bd788e28443b62cb3821baf3" gracePeriod=600 Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.228449 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-zc8ht_f61cc107-39c3-4add-b9a1-45c5d744ea4b/console-operator/0.log" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.228801 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" event={"ID":"f61cc107-39c3-4add-b9a1-45c5d744ea4b","Type":"ContainerStarted","Data":"3cf0bf9159090624f6eca9a9d1e19d4a84856bfd91d1aea72b2b4add97004404"} Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.229812 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.230081 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.230130 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: E0319 20:16:40.230960 4826 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": context deadline exceeded" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.236220 4826 generic.go:334] "Generic (PLEG): container finished" podID="b724e39c-45b5-4701-b4f0-a19969224d90" containerID="04a2f678f0acbe9cdead3d329e1d3aa6d7f40e03a8448d5eb57c7065bb9d062e" exitCode=137 Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.236371 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerDied","Data":"04a2f678f0acbe9cdead3d329e1d3aa6d7f40e03a8448d5eb57c7065bb9d062e"} Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.240600 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w2f68" event={"ID":"b812f1db-b2c8-467c-977a-a8661540546e","Type":"ContainerStarted","Data":"0ca45ce8702f3d635eb1ff8de46bee6e20c18dbf05b2efa9a8d043c8a34f9a1d"} Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.241202 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w2f68" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.418242 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" podUID="010ce31f-d333-43a9-b1e0-cd85cc0f6fd6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.418584 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.424457 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.424501 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.424732 4826 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-v6d7k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.424819 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" podUID="9bc83b3f-72da-4527-b7a8-5f09d3f5f39f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.481425 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc 
kubenswrapper[4826]: I0319 20:16:40.481435 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.481487 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.481547 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.481620 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.481643 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:40 crc 
kubenswrapper[4826]: I0319 20:16:40.481746 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.511217 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" podUID="010ce31f-d333-43a9-b1e0-cd85cc0f6fd6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": read tcp 10.217.0.2:39436->10.217.0.95:8080: read: connection reset by peer" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.536388 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.536721 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": dial tcp 127.0.0.1:7572: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.579536 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.579601 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.726818 4826 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.726885 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.726939 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.728254 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.728287 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.728336 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.728362 4826 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="marketplace-operator" containerStatusID={"Type":"cri-o","ID":"68307e5a08a95cfd72fde6061fc39632cfa52ca8644e891be3785baefa2852fd"} pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" containerMessage="Container marketplace-operator failed liveness probe, will be restarted" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.728399 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" containerID="cri-o://68307e5a08a95cfd72fde6061fc39632cfa52ca8644e891be3785baefa2852fd" gracePeriod=30 Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.768078 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zqv4g" podUID="9f450458-8845-4ec5-9971-6df9dd448312" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.773070 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-mw64r" podUID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:40 crc kubenswrapper[4826]: I0319 20:16:40.773435 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-mw64r" podUID="a3821b4d-7122-428f-be08-2c5f72a29b1d" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.141107 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 
20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.141403 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.141146 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-zvtrc container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.141455 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-zvtrc" podUID="64ca34d8-5f9f-448d-9ab2-414c5b4757e9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.161679 4826 patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.161748 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.161754 4826 
patch_prober.go:28] interesting pod/logging-loki-gateway-68b4bcd8f5-mhqzk container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.161820 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-68b4bcd8f5-mhqzk" podUID="1e1484c9-801f-4999-9754-456df604d7ca" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.252078 4826 generic.go:334] "Generic (PLEG): container finished" podID="e36e6f7a-53ec-4262-b9e5-798353e5bf15" containerID="61d79f08aef814872ceb84e75c780ec6ec93ec84ca6dc2fa776e7a8ab8d65433" exitCode=1 Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.252165 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" event={"ID":"e36e6f7a-53ec-4262-b9e5-798353e5bf15","Type":"ContainerDied","Data":"61d79f08aef814872ceb84e75c780ec6ec93ec84ca6dc2fa776e7a8ab8d65433"} Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.254541 4826 generic.go:334] "Generic (PLEG): container finished" podID="f9f3d33c-0421-473c-94e6-a7860932d772" containerID="9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f" exitCode=0 Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.254608 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wrwzn" event={"ID":"f9f3d33c-0421-473c-94e6-a7860932d772","Type":"ContainerDied","Data":"9fce9490185a1bb948a973838565e9f7f627139a2f115ccc5704f0289fc76b3f"} Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.255290 4826 
patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.255330 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.256734 4826 scope.go:117] "RemoveContainer" containerID="61d79f08aef814872ceb84e75c780ec6ec93ec84ca6dc2fa776e7a8ab8d65433" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.458941 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.458987 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.458998 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 
20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.459072 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.459027 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.459217 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.459759 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.459782 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.461454 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"4e64c0a4a98e62e740173a5f8f4c9dc44425c09198509fa5ce76cbdc06dfbea7"} pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" containerMessage="Container operator failed liveness probe, will be restarted" Mar 19 20:16:41 crc 
kubenswrapper[4826]: I0319 20:16:41.461518 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" containerID="cri-o://4e64c0a4a98e62e740173a5f8f4c9dc44425c09198509fa5ce76cbdc06dfbea7" gracePeriod=30 Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.654876 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.654939 4826 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-prxxj" podUID="b724e39c-45b5-4701-b4f0-a19969224d90" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.655417 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cbmtf" podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.654890 4826 patch_prober.go:28] interesting pod/downloads-7954f5f757-cbmtf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.655526 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cbmtf" 
podUID="0a13bc75-83b6-4952-8e8e-cd93809a87b5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.771533 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.772890 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.772347 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.772078 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.775058 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" containerStatusID={"Type":"cri-o","ID":"f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f"} pod="openshift-monitoring/prometheus-k8s-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.775311 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" containerID="cri-o://f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" 
gracePeriod=600 Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.829505 4826 trace.go:236] Trace[1943972310]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (19-Mar-2026 20:16:39.923) (total time: 1904ms): Mar 19 20:16:41 crc kubenswrapper[4826]: Trace[1943972310]: [1.904166056s] [1.904166056s] END Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.829512 4826 trace.go:236] Trace[1702962397]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-77cc2" (19-Mar-2026 20:16:35.423) (total time: 6403ms): Mar 19 20:16:41 crc kubenswrapper[4826]: Trace[1702962397]: [6.403983948s] [6.403983948s] END Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.829521 4826 trace.go:236] Trace[2035344256]: "Calculate volume metrics of prometheus-metric-storage-db for pod openstack/prometheus-metric-storage-0" (19-Mar-2026 20:16:28.935) (total time: 12892ms): Mar 19 20:16:41 crc kubenswrapper[4826]: Trace[2035344256]: [12.892239268s] [12.892239268s] END Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.829522 4826 trace.go:236] Trace[1127287295]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (19-Mar-2026 20:16:32.760) (total time: 9066ms): Mar 19 20:16:41 crc kubenswrapper[4826]: Trace[1127287295]: [9.066635408s] [9.066635408s] END Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.922573 4826 trace.go:236] Trace[1201991096]: "Calculate volume metrics of storage for pod minio-dev/minio" (19-Mar-2026 20:16:28.880) (total time: 13042ms): Mar 19 20:16:41 crc kubenswrapper[4826]: Trace[1201991096]: [13.042205391s] [13.042205391s] END Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.966760 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.966810 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.966843 4826 patch_prober.go:28] interesting pod/perses-operator-6648f6899-wbmts container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.966924 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-6648f6899-wbmts" podUID="8eb71543-680b-4018-94e4-572cfcc12660" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.967106 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 19 20:16:41 crc kubenswrapper[4826]: I0319 20:16:41.967119 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" podUID="a960df53-d712-424a-85a7-64b0e50c911f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": dial tcp 10.217.0.110:8081: 
connect: connection refused" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.131864 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.131927 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.131873 4826 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-nlft6 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.131982 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nlft6" podUID="4858c7f7-6a71-40dc-8222-082f6d97504c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.241210 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get 
\"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.241304 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.269043 4826 generic.go:334] "Generic (PLEG): container finished" podID="a960df53-d712-424a-85a7-64b0e50c911f" containerID="c85539ce9517f0b13ccc4eb01b1c3dbeae6d6c7e1282718adce636b8cd3ae051" exitCode=1 Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.269142 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" event={"ID":"a960df53-d712-424a-85a7-64b0e50c911f","Type":"ContainerDied","Data":"c85539ce9517f0b13ccc4eb01b1c3dbeae6d6c7e1282718adce636b8cd3ae051"} Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.271058 4826 scope.go:117] "RemoveContainer" containerID="c85539ce9517f0b13ccc4eb01b1c3dbeae6d6c7e1282718adce636b8cd3ae051" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.273882 4826 generic.go:334] "Generic (PLEG): container finished" podID="918ac815-fe60-44b9-b6c0-c99ee8dc80b8" containerID="72ab8a50302e29cc88abed996900e9d0b141a6a645cbea56cf1e66e7ad06f07c" exitCode=1 Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.273972 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" event={"ID":"918ac815-fe60-44b9-b6c0-c99ee8dc80b8","Type":"ContainerDied","Data":"72ab8a50302e29cc88abed996900e9d0b141a6a645cbea56cf1e66e7ad06f07c"} Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.275298 4826 scope.go:117] "RemoveContainer" containerID="72ab8a50302e29cc88abed996900e9d0b141a6a645cbea56cf1e66e7ad06f07c" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.282147 4826 patch_prober.go:28] interesting 
pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.282194 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.282131 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.282253 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.290344 4826 generic.go:334] "Generic (PLEG): container finished" podID="010ce31f-d333-43a9-b1e0-cd85cc0f6fd6" containerID="b5b421b006d9403cd764de38fd564830fc823289b50ceab2e322440e22638665" exitCode=1 Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.290510 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" event={"ID":"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6","Type":"ContainerDied","Data":"b5b421b006d9403cd764de38fd564830fc823289b50ceab2e322440e22638665"} Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.291991 4826 scope.go:117] "RemoveContainer" 
containerID="b5b421b006d9403cd764de38fd564830fc823289b50ceab2e322440e22638665" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.292403 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": dial tcp 10.217.0.116:8081: connect: connection refused" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.292699 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" podUID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": dial tcp 10.217.0.116:8081: connect: connection refused" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.306755 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-prxxj" event={"ID":"b724e39c-45b5-4701-b4f0-a19969224d90","Type":"ContainerStarted","Data":"392ebab065f2ad4f6f7699a1fc748c362340caaab0dc1cb72feb5f3b0ad02071"} Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.307397 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.315761 4826 generic.go:334] "Generic (PLEG): container finished" podID="4f382869-5ee2-4a46-8188-d4ddd0bee2fa" containerID="b857d8cdc6d9f653d3c6edbb6a6cdeff43f9dad5bd645072af281d18fefa3ae7" exitCode=1 Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.315821 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" event={"ID":"4f382869-5ee2-4a46-8188-d4ddd0bee2fa","Type":"ContainerDied","Data":"b857d8cdc6d9f653d3c6edbb6a6cdeff43f9dad5bd645072af281d18fefa3ae7"} Mar 19 20:16:42 crc 
kubenswrapper[4826]: I0319 20:16:42.316390 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"41fa4ca7bfde9c58a52e5d57a2358483ce84ae4dac66f1e5cadf2397e21f9fbf"} pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" containerMessage="Container frr-k8s-webhook-server failed liveness probe, will be restarted" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.316451 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" containerID="cri-o://41fa4ca7bfde9c58a52e5d57a2358483ce84ae4dac66f1e5cadf2397e21f9fbf" gracePeriod=10 Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.317347 4826 scope.go:117] "RemoveContainer" containerID="b857d8cdc6d9f653d3c6edbb6a6cdeff43f9dad5bd645072af281d18fefa3ae7" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.353835 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.353898 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.422787 4826 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler 
namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.422839 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.500850 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.500904 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.769872 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.808245 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-554d47978d-xzcgd" podUID="a4ae16c3-6030-48fc-b4d2-057a45770fe1" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.213:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.808337 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-554d47978d-xzcgd" podUID="a4ae16c3-6030-48fc-b4d2-057a45770fe1" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.213:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.989204 4826 patch_prober.go:28] interesting pod/monitoring-plugin-747c5d4c44-ltxl4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:42 crc kubenswrapper[4826]: I0319 20:16:42.989254 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" podUID="2b19eec2-98e8-47bd-b68f-55b033eb788c" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.089853 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.283932 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" podUID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get 
\"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.331643 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" event={"ID":"e36e6f7a-53ec-4262-b9e5-798353e5bf15","Type":"ContainerStarted","Data":"c0b8081bb46a37bc5133d3312b05fd08fe9f88e835867a0dd34070d133a76a70"} Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.332056 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.335346 4826 generic.go:334] "Generic (PLEG): container finished" podID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerID="9f3ea13753127e75061a238605c9b00b1148fb1f5ff5ec51558df743b2cf27d5" exitCode=0 Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.335461 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" event={"ID":"781f0741-f222-4ccc-aa80-6dde59e9648d","Type":"ContainerDied","Data":"9f3ea13753127e75061a238605c9b00b1148fb1f5ff5ec51558df743b2cf27d5"} Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.338760 4826 generic.go:334] "Generic (PLEG): container finished" podID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerID="848443327f956e513fb70499b5d5c8874d8078b35219c51423462c8665815814" exitCode=1 Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.338819 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" event={"ID":"84bba80c-841e-4df3-87e0-901afbc23bf3","Type":"ContainerDied","Data":"848443327f956e513fb70499b5d5c8874d8078b35219c51423462c8665815814"} Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.340234 4826 scope.go:117] "RemoveContainer" 
containerID="848443327f956e513fb70499b5d5c8874d8078b35219c51423462c8665815814" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.343455 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-drbf6_ee11e1f6-25be-40f4-b19b-a2d8e439d8c6/router/0.log" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.343570 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerID="e5ad2e2cb4a689f1fcd62421183d45d8058fdf428b04c4a6dd5bcdf89ea60bd9" exitCode=137 Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.343668 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drbf6" event={"ID":"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6","Type":"ContainerDied","Data":"e5ad2e2cb4a689f1fcd62421183d45d8058fdf428b04c4a6dd5bcdf89ea60bd9"} Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.580462 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.580534 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.619356 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 20:16:43 crc 
kubenswrapper[4826]: I0319 20:16:43.767937 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" probeResult="failure" output="command timed out" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.773014 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-engine-84bf69fdcb-b6hq4" podUID="a8d87bc1-29fa-4219-8c55-968d58f697e8" containerName="heat-engine" probeResult="failure" output="command timed out" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.776094 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-engine-84bf69fdcb-b6hq4" podUID="a8d87bc1-29fa-4219-8c55-968d58f697e8" containerName="heat-engine" probeResult="failure" output="command timed out" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.826084 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" probeResult="failure" output=< Mar 19 20:16:43 crc kubenswrapper[4826]: % Total % Received % Xferd Average Speed Time Time Time Current Mar 19 20:16:43 crc kubenswrapper[4826]: Dload Upload Total Spent Left Speed Mar 19 20:16:43 crc kubenswrapper[4826]: [166B blob data] Mar 19 20:16:43 crc kubenswrapper[4826]: curl: (22) The requested URL returned error: 503 Mar 19 20:16:43 crc kubenswrapper[4826]: > Mar 19 20:16:43 crc kubenswrapper[4826]: E0319 20:16:43.829351 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null 
http://localhost:9090/-/ready; else exit 1; fi"] Mar 19 20:16:43 crc kubenswrapper[4826]: E0319 20:16:43.831026 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Mar 19 20:16:43 crc kubenswrapper[4826]: E0319 20:16:43.833196 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Mar 19 20:16:43 crc kubenswrapper[4826]: E0319 20:16:43.833229 4826 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus" Mar 19 20:16:43 crc kubenswrapper[4826]: I0319 20:16:43.916127 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.011688 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 20:16:44 crc 
kubenswrapper[4826]: I0319 20:16:44.013707 4826 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.013748 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.354912 4826 generic.go:334] "Generic (PLEG): container finished" podID="38267b94-39ea-4067-9b6e-3d863ff60494" containerID="d0aa998d1f9489b9c4d36b5764a678bbc5c98ff707f120f886b97f7655249ec1" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.354999 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" event={"ID":"38267b94-39ea-4067-9b6e-3d863ff60494","Type":"ContainerDied","Data":"d0aa998d1f9489b9c4d36b5764a678bbc5c98ff707f120f886b97f7655249ec1"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.356225 4826 scope.go:117] "RemoveContainer" containerID="d0aa998d1f9489b9c4d36b5764a678bbc5c98ff707f120f886b97f7655249ec1" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.357243 4826 generic.go:334] "Generic (PLEG): container finished" podID="79a89fcd-3226-4314-951d-d94af2ac242c" containerID="aa78967d3e7a9430579fcede00919a69d6078be67f4a5f03bb285f24e92fcd4a" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.357298 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" 
event={"ID":"79a89fcd-3226-4314-951d-d94af2ac242c","Type":"ContainerDied","Data":"aa78967d3e7a9430579fcede00919a69d6078be67f4a5f03bb285f24e92fcd4a"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.358154 4826 scope.go:117] "RemoveContainer" containerID="aa78967d3e7a9430579fcede00919a69d6078be67f4a5f03bb285f24e92fcd4a" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.359737 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" event={"ID":"918ac815-fe60-44b9-b6c0-c99ee8dc80b8","Type":"ContainerStarted","Data":"55bded078e0e315fcf7676f4772b45b0c1f82dc3fd393d51958245018d54bf3c"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.359964 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.362557 4826 generic.go:334] "Generic (PLEG): container finished" podID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerID="a0a6be61771c3bd1837bd426d4b58d5149185e19bd788e28443b62cb3821baf3" exitCode=0 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.362615 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerDied","Data":"a0a6be61771c3bd1837bd426d4b58d5149185e19bd788e28443b62cb3821baf3"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.364787 4826 generic.go:334] "Generic (PLEG): container finished" podID="0f77f094-1b90-43a6-85be-27e8b1fda71f" containerID="308e7798925ff5c58289d7bde2eacf17176005865caf2c61701339d14b9e2601" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.364803 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" 
event={"ID":"0f77f094-1b90-43a6-85be-27e8b1fda71f","Type":"ContainerDied","Data":"308e7798925ff5c58289d7bde2eacf17176005865caf2c61701339d14b9e2601"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.365821 4826 scope.go:117] "RemoveContainer" containerID="308e7798925ff5c58289d7bde2eacf17176005865caf2c61701339d14b9e2601" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.366688 4826 generic.go:334] "Generic (PLEG): container finished" podID="6243b523-966a-4f1d-b663-2f1ed4614fdb" containerID="b404056dacf1668e9a6ee2842e12e9fe4d3565f74446b93e187a9a86c94ff70e" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.366705 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" event={"ID":"6243b523-966a-4f1d-b663-2f1ed4614fdb","Type":"ContainerDied","Data":"b404056dacf1668e9a6ee2842e12e9fe4d3565f74446b93e187a9a86c94ff70e"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.367745 4826 scope.go:117] "RemoveContainer" containerID="b404056dacf1668e9a6ee2842e12e9fe4d3565f74446b93e187a9a86c94ff70e" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.368585 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" event={"ID":"a960df53-d712-424a-85a7-64b0e50c911f","Type":"ContainerStarted","Data":"436bcd8d417d7532ad497e96aeb0b69b47b89a988c50dc9a5fa7023f623b0e36"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.368830 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.370820 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wrwzn" event={"ID":"f9f3d33c-0421-473c-94e6-a7860932d772","Type":"ContainerStarted","Data":"1cc4a9ec2a4cd0bd5c2739bee338b621e2afee3939a63043a5230e06b1839761"} Mar 19 
20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.372670 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" event={"ID":"81cad5dc-6bd8-4081-adc1-28f65b056636","Type":"ContainerDied","Data":"41fa4ca7bfde9c58a52e5d57a2358483ce84ae4dac66f1e5cadf2397e21f9fbf"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.372646 4826 generic.go:334] "Generic (PLEG): container finished" podID="81cad5dc-6bd8-4081-adc1-28f65b056636" containerID="41fa4ca7bfde9c58a52e5d57a2358483ce84ae4dac66f1e5cadf2397e21f9fbf" exitCode=0 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.374423 4826 generic.go:334] "Generic (PLEG): container finished" podID="b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6" containerID="d7fea80738dae6f7fc1cd87625eb88d0f943a85c0a65baac45a7c3ceda09127e" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.374462 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" event={"ID":"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6","Type":"ContainerDied","Data":"d7fea80738dae6f7fc1cd87625eb88d0f943a85c0a65baac45a7c3ceda09127e"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.375203 4826 scope.go:117] "RemoveContainer" containerID="d7fea80738dae6f7fc1cd87625eb88d0f943a85c0a65baac45a7c3ceda09127e" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.378004 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.380826 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.384162 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.384192 4826 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f29e73691785ec65fa525c1e8b3f6ee226df96250aa3138bf89302e5e9c8b33e" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.384236 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f29e73691785ec65fa525c1e8b3f6ee226df96250aa3138bf89302e5e9c8b33e"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.384257 4826 scope.go:117] "RemoveContainer" containerID="a22f8de90a48c727556cc628544f3262bb1f7f32592a6672b8895a9e395d28af" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.385210 4826 scope.go:117] "RemoveContainer" containerID="f29e73691785ec65fa525c1e8b3f6ee226df96250aa3138bf89302e5e9c8b33e" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.388570 4826 generic.go:334] "Generic (PLEG): container finished" podID="7137162e-cccf-4ce6-9dc4-7380db33a85a" containerID="0b5c426ab0a77b4f05b5408a3685d9a3d53c05a20614569d4f59f0e560fa47e3" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.388695 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" event={"ID":"7137162e-cccf-4ce6-9dc4-7380db33a85a","Type":"ContainerDied","Data":"0b5c426ab0a77b4f05b5408a3685d9a3d53c05a20614569d4f59f0e560fa47e3"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.390214 4826 scope.go:117] "RemoveContainer" containerID="0b5c426ab0a77b4f05b5408a3685d9a3d53c05a20614569d4f59f0e560fa47e3" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.396696 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerID="d633e3e09132bed99d00b1fbe22f27863f03fa82f637c17e9bce1e725b46e4df" exitCode=0 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.396777 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" event={"ID":"67f96c65-0583-4f62-a063-98c7e6bbfb87","Type":"ContainerDied","Data":"d633e3e09132bed99d00b1fbe22f27863f03fa82f637c17e9bce1e725b46e4df"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.419898 4826 generic.go:334] "Generic (PLEG): container finished" podID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerID="9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a" exitCode=0 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.420043 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerDied","Data":"9ba1abb6e9c4b14c4a37526dcfb4f281941152770b5eee3214e8be9e4b81582a"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.432570 4826 generic.go:334] "Generic (PLEG): container finished" podID="ee5c97c9-5dc0-4292-9a34-08ca45f5387a" containerID="9aab00592e3103b510ac88ec4710424ef42bc2192e995f53db7890e497783d40" exitCode=1 Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.433553 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" event={"ID":"ee5c97c9-5dc0-4292-9a34-08ca45f5387a","Type":"ContainerDied","Data":"9aab00592e3103b510ac88ec4710424ef42bc2192e995f53db7890e497783d40"} Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.435148 4826 scope.go:117] "RemoveContainer" containerID="9aab00592e3103b510ac88ec4710424ef42bc2192e995f53db7890e497783d40" Mar 19 20:16:44 crc kubenswrapper[4826]: I0319 20:16:44.721667 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-d4pjw" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.115712 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-qrlfg" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.118753 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qltmk" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.204668 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-l2p46" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.217071 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.321000 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="38814433-1737-49df-966a-ac3511ed48dd" containerName="galera" containerID="cri-o://3c6b4dafb4bb937c4481ee36080942d492dddee83e2f324b34dcb098d03b3ea9" gracePeriod=19 Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.398688 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="763c5ded-be94-49ad-9eea-447e444f24f3" containerName="galera" containerID="cri-o://48ef0c18927acad3cab5327c9df4d256a3f01325b10cf9e6772558514a35dec9" gracePeriod=8 Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.477161 4826 generic.go:334] "Generic (PLEG): container finished" podID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerID="68307e5a08a95cfd72fde6061fc39632cfa52ca8644e891be3785baefa2852fd" exitCode=0 Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.477279 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" event={"ID":"f182fb72-66c7-4d5d-bccd-29a47b27f4c6","Type":"ContainerDied","Data":"68307e5a08a95cfd72fde6061fc39632cfa52ca8644e891be3785baefa2852fd"} Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.487820 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.487881 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.489323 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.489363 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.506901 4826 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429" exitCode=0
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.507029 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.521930 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.533422 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.536975 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.554178 4826 generic.go:334] "Generic (PLEG): container finished" podID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerID="124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb" exitCode=0
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.554261 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" event={"ID":"5f25fb62-ec83-409e-88fb-0073d07869b9","Type":"ContainerDied","Data":"124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.559798 4826 generic.go:334] "Generic (PLEG): container finished" podID="080fa697-4720-424e-b75e-6564061cd68f" containerID="91ee439cd3099250e4f963c3380bde90f7c6552487df9e0cc33fb009ec6db276" exitCode=1
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.559973 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" event={"ID":"080fa697-4720-424e-b75e-6564061cd68f","Type":"ContainerDied","Data":"91ee439cd3099250e4f963c3380bde90f7c6552487df9e0cc33fb009ec6db276"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.560786 4826 scope.go:117] "RemoveContainer" containerID="91ee439cd3099250e4f963c3380bde90f7c6552487df9e0cc33fb009ec6db276"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.581641 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-prxxj"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.600093 4826 generic.go:334] "Generic (PLEG): container finished" podID="f073a654-efe9-4fd0-9c08-23d9fdb0d492" containerID="551ea97bff248de44602944c01ec7662e22f9f72421824709fd6bb957972ba0a" exitCode=1
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.600175 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" event={"ID":"f073a654-efe9-4fd0-9c08-23d9fdb0d492","Type":"ContainerDied","Data":"551ea97bff248de44602944c01ec7662e22f9f72421824709fd6bb957972ba0a"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.602010 4826 scope.go:117] "RemoveContainer" containerID="551ea97bff248de44602944c01ec7662e22f9f72421824709fd6bb957972ba0a"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.616615 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.627311 4826 generic.go:334] "Generic (PLEG): container finished" podID="352eae31-d0e1-452b-8319-ab53b8095b5a" containerID="d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584" exitCode=0
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.627368 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" event={"ID":"352eae31-d0e1-452b-8319-ab53b8095b5a","Type":"ContainerDied","Data":"d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.639292 4826 generic.go:334] "Generic (PLEG): container finished" podID="50980b03-91b0-4e4d-9923-e2a531458fd4" containerID="0c2f34ab591639d111c4934a0d9282a82981c96de35c94a53b74fbcfb47cae74" exitCode=1
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.639351 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" event={"ID":"50980b03-91b0-4e4d-9923-e2a531458fd4","Type":"ContainerDied","Data":"0c2f34ab591639d111c4934a0d9282a82981c96de35c94a53b74fbcfb47cae74"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.640218 4826 scope.go:117] "RemoveContainer" containerID="0c2f34ab591639d111c4934a0d9282a82981c96de35c94a53b74fbcfb47cae74"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.642122 4826 generic.go:334] "Generic (PLEG): container finished" podID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerID="d3a209fa43a7430fe81c94d1187f4b67a6e187072f60a4066fd2e2e620507871" exitCode=0
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.642252 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" event={"ID":"fdb49b25-5e81-4f9d-9a17-34bade2cec18","Type":"ContainerDied","Data":"d3a209fa43a7430fe81c94d1187f4b67a6e187072f60a4066fd2e2e620507871"}
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.644924 4826 generic.go:334] "Generic (PLEG): container finished" podID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerID="79bd64d7d2c89268df3c32958214a5b3ac6e94666db0ac84e1fea7d21c755d03" exitCode=0
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.644995 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" event={"ID":"e5996d80-d5eb-423c-8965-1f5704c3dd69","Type":"ContainerDied","Data":"79bd64d7d2c89268df3c32958214a5b3ac6e94666db0ac84e1fea7d21c755d03"}
Mar 19 20:16:45 crc kubenswrapper[4826]: E0319 20:16:45.649167 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f25fb62_ec83_409e_88fb_0073d07869b9.slice/crio-conmon-124b4e552bd3b6885f319faa228718a478ff32940ded948a03982e9e648fe0bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod352eae31_d0e1_452b_8319_ab53b8095b5a.slice/crio-conmon-d2e3ec0be6cedec0fc7839de9ce4bc718a61bf1dc4687ae1ec01c9bb5e46e584.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc045bb2f_b87b_4a14_92b5_0b98cdc7a0d1.slice/crio-8b365c4de811f05a97f6948925e513561fcd22296731daa07c22f7bbd7cfd7cc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-conmon-20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-20939d58b80c88ee271214997c2a628fedf2297700bc34d570f5ef7a0aba7429.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.737816 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0"
Mar 19 20:16:45 crc kubenswrapper[4826]: I0319 20:16:45.929866 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-jjqrs"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.004140 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": dial tcp 10.217.0.108:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.011144 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" podUID="d2375678-e630-4376-9dfd-28efbc77aed4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": dial tcp 10.217.0.108:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.060601 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.060864 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.060881 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.060890 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.060899 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.157281 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.231360 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-zrczt"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.307515 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podUID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": dial tcp 10.217.0.114:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.309639 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" podUID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": dial tcp 10.217.0.114:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.332897 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-xpq6x"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.442055 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.507841 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": dial tcp 10.217.0.119:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.508398 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" podUID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": dial tcp 10.217.0.119:8081: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.582155 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.582203 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.596267 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.596309 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.660315 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" event={"ID":"4f382869-5ee2-4a46-8188-d4ddd0bee2fa","Type":"ContainerStarted","Data":"4bdc6ba817ffe3123f8cb852d761854786edc2513ea165f7e1bce3cdb6a1cdc3"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.661120 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.662589 4826 generic.go:334] "Generic (PLEG): container finished" podID="49f5fbe6-ba93-4ff2-b575-aa08dceb2622" containerID="fbae391a355fca143beec49ad5c733d24d2d87928b4d4ffc5d5d192fd32d6246" exitCode=1
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.662620 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" event={"ID":"49f5fbe6-ba93-4ff2-b575-aa08dceb2622","Type":"ContainerDied","Data":"fbae391a355fca143beec49ad5c733d24d2d87928b4d4ffc5d5d192fd32d6246"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.662946 4826 scope.go:117] "RemoveContainer" containerID="fbae391a355fca143beec49ad5c733d24d2d87928b4d4ffc5d5d192fd32d6246"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.665794 4826 generic.go:334] "Generic (PLEG): container finished" podID="c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1" containerID="8b365c4de811f05a97f6948925e513561fcd22296731daa07c22f7bbd7cfd7cc" exitCode=1
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.665832 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" event={"ID":"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1","Type":"ContainerDied","Data":"8b365c4de811f05a97f6948925e513561fcd22296731daa07c22f7bbd7cfd7cc"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.666129 4826 scope.go:117] "RemoveContainer" containerID="8b365c4de811f05a97f6948925e513561fcd22296731daa07c22f7bbd7cfd7cc"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.685786 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" event={"ID":"010ce31f-d333-43a9-b1e0-cd85cc0f6fd6","Type":"ContainerStarted","Data":"b1f1e1a2a35632026d4f35b7d98fe73b0d04e9b19bdd602ca0c630d468fd117f"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.687087 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused" start-of-body=
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.687281 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.687237 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.697883 4826 generic.go:334] "Generic (PLEG): container finished" podID="ab298593-ac97-4031-8bfc-b0e5be9b341a" containerID="6bce5b1cd3e4e908191a9ef12dfd6f5c8e6ba3c2ec093ded1e9938b5a4c85dc8" exitCode=0
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.697941 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerDied","Data":"6bce5b1cd3e4e908191a9ef12dfd6f5c8e6ba3c2ec093ded1e9938b5a4c85dc8"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.701243 4826 generic.go:334] "Generic (PLEG): container finished" podID="d2375678-e630-4376-9dfd-28efbc77aed4" containerID="497d04c6c02cd33a4535ca06ac6841664f0d2fb4d066c68a30101f9f2463f892" exitCode=1
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.701290 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" event={"ID":"d2375678-e630-4376-9dfd-28efbc77aed4","Type":"ContainerDied","Data":"497d04c6c02cd33a4535ca06ac6841664f0d2fb4d066c68a30101f9f2463f892"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.702003 4826 scope.go:117] "RemoveContainer" containerID="497d04c6c02cd33a4535ca06ac6841664f0d2fb4d066c68a30101f9f2463f892"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.708213 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" event={"ID":"781f0741-f222-4ccc-aa80-6dde59e9648d","Type":"ContainerStarted","Data":"704defe2abac9609996764bf71d71603e892b6b09adcd9fe54ca385fb2176a66"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.709283 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.709360 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body=
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.709386 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.716956 4826 generic.go:334] "Generic (PLEG): container finished" podID="217c809e-0af8-4b11-a5ce-932d698ed444" containerID="4e64c0a4a98e62e740173a5f8f4c9dc44425c09198509fa5ce76cbdc06dfbea7" exitCode=0
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.717119 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" event={"ID":"217c809e-0af8-4b11-a5ce-932d698ed444","Type":"ContainerDied","Data":"4e64c0a4a98e62e740173a5f8f4c9dc44425c09198509fa5ce76cbdc06dfbea7"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.720561 4826 generic.go:334] "Generic (PLEG): container finished" podID="44055ef9-1bc5-4b25-a40d-553a1546fc15" containerID="c0aea16d54113eea374338a2a5ad761f722d63d272579619d2e5a1377a28915e" exitCode=1
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.720602 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" event={"ID":"44055ef9-1bc5-4b25-a40d-553a1546fc15","Type":"ContainerDied","Data":"c0aea16d54113eea374338a2a5ad761f722d63d272579619d2e5a1377a28915e"}
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.721534 4826 scope.go:117] "RemoveContainer" containerID="c0aea16d54113eea374338a2a5ad761f722d63d272579619d2e5a1377a28915e"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.787767 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.905927 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.956480 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-skdcp"
Mar 19 20:16:46 crc kubenswrapper[4826]: I0319 20:16:46.978065 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6c5c766d94-258q2"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.019286 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-7l4t6"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.028707 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.089116 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="bf194957-ec68-4ea7-b094-3e0912bc3bc5" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": dial tcp 10.217.0.171:9090: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.311177 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-849c6d8fdf-t6vlp"
Mar 19 20:16:47 crc kubenswrapper[4826]: E0319 20:16:47.592031 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f is running failed: container process not found" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 19 20:16:47 crc kubenswrapper[4826]: E0319 20:16:47.592523 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f is running failed: container process not found" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 19 20:16:47 crc kubenswrapper[4826]: E0319 20:16:47.592912 4826 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f is running failed: container process not found" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Mar 19 20:16:47 crc kubenswrapper[4826]: E0319 20:16:47.592944 4826 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerName="prometheus"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.737819 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" event={"ID":"0f77f094-1b90-43a6-85be-27e8b1fda71f","Type":"ContainerStarted","Data":"ce8342e738f339e677404130d20d533157ac2c8a3c2c6ac64ca5f1f402487a3a"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.738096 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.741640 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" event={"ID":"6243b523-966a-4f1d-b663-2f1ed4614fdb","Type":"ContainerStarted","Data":"bb98760a94103b6936ad0f1d2487285e4fedea31df9aa180081de42ea4986402"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.741863 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.745333 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" event={"ID":"7137162e-cccf-4ce6-9dc4-7380db33a85a","Type":"ContainerStarted","Data":"977ed7975ae4e64873fe1fc51a2e7040961a4101381d68655ae1a8e518ea6594"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.745388 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.749140 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" event={"ID":"fdb49b25-5e81-4f9d-9a17-34bade2cec18","Type":"ContainerStarted","Data":"c91c4e73bced64dd61dce685d43e67abcaed958a20649f4ae807a0f50ed1131a"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.749637 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.749705 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.749739 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.765425 4826 generic.go:334] "Generic (PLEG): container finished" podID="cb23233f-a975-4476-8bff-5e7b4b9c8646" containerID="f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f" exitCode=0
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.766037 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerDied","Data":"f9625e2a72fc3f907926b5792559f166baea15fc5366e9be6d1426295fd0720f"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.799118 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5jvh4" event={"ID":"b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6","Type":"ContainerStarted","Data":"3d03f6f0da31fb004a5621b2e0ddfa4dbab2771485a03ff9c29de5a8c215a31e"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.816132 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" event={"ID":"67f96c65-0583-4f62-a063-98c7e6bbfb87","Type":"ContainerStarted","Data":"c4e9d4287bf5558fcf4a34182bf04147f2cee7865a61cf2ae3e242f323644ad0"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.816875 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.817270 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused" start-of-body=
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.817321 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.840488 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bf194957-ec68-4ea7-b094-3e0912bc3bc5","Type":"ContainerStarted","Data":"9401e1a248af1b51b1cbcbb94731e7e009661fa62213fe23d79cf52ea28343c3"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.851917 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" event={"ID":"080fa697-4720-424e-b75e-6564061cd68f","Type":"ContainerStarted","Data":"91fc15b713f3decde4110c6846241fbda1fe53f1a6a1d30751a15a640ffe23a0"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.852197 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.859771 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" event={"ID":"e5996d80-d5eb-423c-8965-1f5704c3dd69","Type":"ContainerStarted","Data":"0ef97a7b01a7d5ddd46940a29f48805c159577b5af429f7c54aec506924f3fbf"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.859836 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.860084 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body=
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.860127 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.862606 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" event={"ID":"f073a654-efe9-4fd0-9c08-23d9fdb0d492","Type":"ContainerStarted","Data":"a418afbfaf04131498e1150987f17791bd991f8b99772620ec5f35dfd0cd3f53"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.863722 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.866993 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9rmq" event={"ID":"3a20f6a8-01f3-4492-856d-e5f494672fa3","Type":"ContainerStarted","Data":"2d5c261cfd82adca71381f67e79fb61b30b42f7ce29c7c7c4876b15a535a26a6"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.880737 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab298593-ac97-4031-8bfc-b0e5be9b341a","Type":"ContainerStarted","Data":"77f4662c2806af827e1689d25552d32a3d904d36a1f2250581af259883d234b2"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.890432 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" event={"ID":"ee5c97c9-5dc0-4292-9a34-08ca45f5387a","Type":"ContainerStarted","Data":"14167ffa3b45ebf1a491593aed65279a05a14f597787acef22e4f6f2490409ec"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.891057 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.893173 4826 generic.go:334] "Generic (PLEG): container finished" podID="72f0a310-1676-49a4-826a-d83406d28e93" containerID="76591b9cfad14c68b1ef112f6ed7cca58927da7f28bfa6fafae17389b99d7728" exitCode=0
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.893230 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" event={"ID":"72f0a310-1676-49a4-826a-d83406d28e93","Type":"ContainerDied","Data":"76591b9cfad14c68b1ef112f6ed7cca58927da7f28bfa6fafae17389b99d7728"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.905700 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" event={"ID":"5f25fb62-ec83-409e-88fb-0073d07869b9","Type":"ContainerStarted","Data":"8260030d80cac63c5bfb90ef66196b9d8d6d8bb61898400cc5f9073ff0e7e1b7"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.906013 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.906213 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body=
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.906248 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.913609 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" event={"ID":"f182fb72-66c7-4d5d-bccd-29a47b27f4c6","Type":"ContainerStarted","Data":"fe5975c18f5012d817e0d622fa7d9fed67cce71a696229b5746a848198f090b1"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.914564 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.914949 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" start-of-body=
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.914980 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.917000 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-drbf6_ee11e1f6-25be-40f4-b19b-a2d8e439d8c6/router/0.log"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.918074 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-drbf6" event={"ID":"ee11e1f6-25be-40f4-b19b-a2d8e439d8c6","Type":"ContainerStarted","Data":"dbbf941db66380e1a48849f1493f214715937eaac309e14145f0be505ff2259e"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.932808 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.938368 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.938514 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5bb8b4ddf0e9e06d8db504174710452c60b6340f6dfba912299024ce30d862c"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.947097 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" event={"ID":"50980b03-91b0-4e4d-9923-e2a531458fd4","Type":"ContainerStarted","Data":"5e1fa38e4e6873b3c226fbf6bc9d1761d0ec246573a5977646ecbd5acba02965"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.947774 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.951152 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zl2jh" event={"ID":"352eae31-d0e1-452b-8319-ab53b8095b5a","Type":"ContainerStarted","Data":"424e2933cb925676ed2e186e426777debd7ada4ce7e4417475b1ed225c2ea404"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.955604 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" event={"ID":"79a89fcd-3226-4314-951d-d94af2ac242c","Type":"ContainerStarted","Data":"3e35daa40019e07cac814765f3dcf2b292c145f61bcb6149cb9b220c6da76cd0"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.956731 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.959849 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" event={"ID":"81cad5dc-6bd8-4081-adc1-28f65b056636","Type":"ContainerStarted","Data":"36f6d929a99a886ebdebc2bd2f03571d97237f7bddacae04644527c54dbef4a2"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.959900 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx"
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.973203 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" event={"ID":"84bba80c-841e-4df3-87e0-901afbc23bf3","Type":"ContainerStarted","Data":"7c50021b4219273587d961a11e95b6d48adfef16979ff0e88917e4f446fdd044"}
Mar 19 20:16:47 crc kubenswrapper[4826]: I0319 20:16:47.974356 4826
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.019358 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.019402 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" event={"ID":"38267b94-39ea-4067-9b6e-3d863ff60494","Type":"ContainerStarted","Data":"19a833312c824854ca04d923cc3a957b0ee756676b55f756b845f64f9a7c7b66"} Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.025745 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9ed5a6e252c5df9726e5644cbd2f8e45260fcb2c0be19e0d95dac9681b1e483"} Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.025786 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.026366 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.026421 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 19 20:16:48 crc 
kubenswrapper[4826]: I0319 20:16:48.240283 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.242012 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.242056 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.859549 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:48 crc kubenswrapper[4826]: I0319 20:16:48.860900 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.038718 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" event={"ID":"d2375678-e630-4376-9dfd-28efbc77aed4","Type":"ContainerStarted","Data":"c1f5f246eb441e17116a5c7db443cdbcf1f6d54958c104f266a0465d714d4f1b"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.040489 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.045892 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" event={"ID":"49f5fbe6-ba93-4ff2-b575-aa08dceb2622","Type":"ContainerStarted","Data":"2f58865e322c6b24cb5906136013cb43257c2410ec31568d18de10a025181c3b"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.046114 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.048586 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" event={"ID":"c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1","Type":"ContainerStarted","Data":"57cb133ea9663b746b4fbe4faa5ceaa1924436283b042cc4940027ff65cf43c3"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.048781 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.052879 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb23233f-a975-4476-8bff-5e7b4b9c8646","Type":"ContainerStarted","Data":"3c3ece291b01cae3e4f832dff7d29b0a56eca4b352ce9c0c3a87692f764db05d"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.066005 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" event={"ID":"217c809e-0af8-4b11-a5ce-932d698ed444","Type":"ContainerStarted","Data":"477da22ce2b367a42998901e5588670404435e666be4a66bf5c4c7f7546fd428"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.067279 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.067615 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb 
container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.067787 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.070520 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" event={"ID":"44055ef9-1bc5-4b25-a40d-553a1546fc15","Type":"ContainerStarted","Data":"d642b315be69af2cae044d232c2fa7d2c333648d415492d03cf1b1379b9dbbf9"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.071494 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.083751 4826 generic.go:334] "Generic (PLEG): container finished" podID="38814433-1737-49df-966a-ac3511ed48dd" containerID="3c6b4dafb4bb937c4481ee36080942d492dddee83e2f324b34dcb098d03b3ea9" exitCode=0 Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.083862 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerDied","Data":"3c6b4dafb4bb937c4481ee36080942d492dddee83e2f324b34dcb098d03b3ea9"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.083895 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"38814433-1737-49df-966a-ac3511ed48dd","Type":"ContainerStarted","Data":"6d240da963366749939a3b65a05fed1bf024c1bde6b87e3ef82f6ea9885d27bb"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.091479 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" event={"ID":"72f0a310-1676-49a4-826a-d83406d28e93","Type":"ContainerStarted","Data":"ee8d92136c42e417f1908fab4dd58a7b18dba5128d073afd529a8a665b7c3b5a"} Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092605 4826 patch_prober.go:28] interesting pod/controller-manager-567cb464d6-bm4t6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092624 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092654 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" podUID="e5996d80-d5eb-423c-8965-1f5704c3dd69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092707 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection 
refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092829 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092900 4826 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fcnzx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.092959 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" podUID="781f0741-f222-4ccc-aa80-6dde59e9648d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.093057 4826 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-sbhr9 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.093086 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" podUID="67f96c65-0583-4f62-a063-98c7e6bbfb87" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": dial tcp 10.217.0.82:8443: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.093370 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.093394 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.094101 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.094140 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.095050 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.095078 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.239431 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp 
[::1]:1936: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.239833 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.258541 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.409710 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.507566 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.644724 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.644776 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" start-of-body= Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.644789 4826 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.644823 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" Mar 19 20:16:49 crc kubenswrapper[4826]: I0319 20:16:49.645970 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8645ff956b-rx86q" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.101096 4826 patch_prober.go:28] interesting pod/route-controller-manager-bb4bb89f7-bhb8x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.101131 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.103957 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" podUID="5f25fb62-ec83-409e-88fb-0073d07869b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 
10.217.0.73:8443: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.104012 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.101977 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.104066 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.102019 4826 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-66v8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.104093 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" podUID="f182fb72-66c7-4d5d-bccd-29a47b27f4c6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": dial tcp 10.217.0.75:8080: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 
20:16:50.156508 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:50 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:50 crc kubenswrapper[4826]: > Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.239824 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.239881 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.375977 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.376019 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.376466 4826 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-tcdmb container/operator namespace/openshift-operators: Readiness 
probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.376487 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" podUID="217c809e-0af8-4b11-a5ce-932d698ed444" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": dial tcp 10.217.0.13:8081: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.481124 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.481165 4826 patch_prober.go:28] interesting pod/console-operator-58897d9998-zc8ht container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.481185 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.481223 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" podUID="f61cc107-39c3-4add-b9a1-45c5d744ea4b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: 
connection refused" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.511768 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-v6d7k" Mar 19 20:16:50 crc kubenswrapper[4826]: I0319 20:16:50.884360 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-6648f6899-wbmts" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.120142 4826 generic.go:334] "Generic (PLEG): container finished" podID="763c5ded-be94-49ad-9eea-447e444f24f3" containerID="48ef0c18927acad3cab5327c9df4d256a3f01325b10cf9e6772558514a35dec9" exitCode=0 Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.121906 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerDied","Data":"48ef0c18927acad3cab5327c9df4d256a3f01325b10cf9e6772558514a35dec9"} Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.121939 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"763c5ded-be94-49ad-9eea-447e444f24f3","Type":"ContainerStarted","Data":"4200dbe778f060dca6da8aedf03d4065efa18daa227895933ce35bbd87e08ef3"} Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.122336 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.122397 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.239385 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.241328 4826 patch_prober.go:28] interesting pod/router-default-5444994796-drbf6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 20:16:51 crc kubenswrapper[4826]: [-]has-synced failed: reason withheld Mar 19 20:16:51 crc kubenswrapper[4826]: [+]process-running ok Mar 19 20:16:51 crc kubenswrapper[4826]: healthz check failed Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.241367 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-drbf6" podUID="ee11e1f6-25be-40f4-b19b-a2d8e439d8c6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.353181 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.353233 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.353237 4826 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dnc22 
container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.353293 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" podUID="fdb49b25-5e81-4f9d-9a17-34bade2cec18" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.601811 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.601856 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.712995 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-operators/openstack-operator-index-wrwzn" podUID="f9f3d33c-0421-473c-94e6-a7860932d772" containerName="registry-server" probeResult="failure" output=< Mar 19 20:16:51 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:16:51 crc kubenswrapper[4826]: > Mar 19 20:16:51 crc kubenswrapper[4826]: I0319 20:16:51.959464 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-zjkbj" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.011363 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fcnzx" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.088680 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 20:16:52 
crc kubenswrapper[4826]: I0319 20:16:52.102991 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-747c5d4c44-ltxl4" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.199211 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w2f68" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.298471 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.302779 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-b76w9" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.315101 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-drbf6" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.579548 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.580146 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.579561 4826 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pfrcn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe 
status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.580333 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" podUID="72f0a310-1676-49a4-826a-d83406d28e93" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.588667 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.626638 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.628821 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.681720 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.681992 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 20:16:52.683769 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"f7c7d7accd377b922c768cd2c63561dd625ead7eed451e5666328299f2ed6f49"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 19 20:16:52 crc kubenswrapper[4826]: I0319 
20:16:52.683920 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerName="cinder-scheduler" containerID="cri-o://f7c7d7accd377b922c768cd2c63561dd625ead7eed451e5666328299f2ed6f49" gracePeriod=30 Mar 19 20:16:54 crc kubenswrapper[4826]: I0319 20:16:54.023394 4826 patch_prober.go:28] interesting pod/loki-operator-controller-manager-d88f59dd5-fqs6s container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": dial tcp 10.217.0.49:8081: connect: connection refused" start-of-body= Mar 19 20:16:54 crc kubenswrapper[4826]: I0319 20:16:54.023871 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" podUID="84bba80c-841e-4df3-87e0-901afbc23bf3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": dial tcp 10.217.0.49:8081: connect: connection refused" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.155085 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c6f68556d-k5tlt" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.492112 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-567cb464d6-bm4t6" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.494592 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bb4bb89f7-bhb8x" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.542707 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.557177 4826 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.588130 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pfrcn" Mar 19 20:16:55 crc kubenswrapper[4826]: I0319 20:16:55.996987 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-zm4ps" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.003177 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rsrjx" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.009830 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-hf8n5" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.025516 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-8265b" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.042896 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ngb9j" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.162155 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-725xd" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.275819 4826 generic.go:334] "Generic (PLEG): container finished" podID="773a141c-3fd8-4f47-b8e1-ad98015ea7d4" containerID="a27243d89376028b26d66dfc28c4f4d0217cd4075ac5c42e4aa3f89b931bc73f" exitCode=0 Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.276102 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565856-nwv2c" event={"ID":"773a141c-3fd8-4f47-b8e1-ad98015ea7d4","Type":"ContainerDied","Data":"a27243d89376028b26d66dfc28c4f4d0217cd4075ac5c42e4aa3f89b931bc73f"} Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.308039 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4bkbn" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.449311 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zs74n" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.505299 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tjcmb" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.599124 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-kkmzl" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.680321 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-sfs65" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.694167 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-sbhr9" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.791037 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-j4p25" Mar 19 20:16:56 crc kubenswrapper[4826]: I0319 20:16:56.995973 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-6vmk6" Mar 19 20:16:57 crc kubenswrapper[4826]: I0319 20:16:57.319112 4826 
generic.go:334] "Generic (PLEG): container finished" podID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerID="3afda7e361b537c4d54b166e46699b3580326e89b48ef9f26da22c6b1123e603" exitCode=0 Mar 19 20:16:57 crc kubenswrapper[4826]: I0319 20:16:57.319197 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerDied","Data":"3afda7e361b537c4d54b166e46699b3580326e89b48ef9f26da22c6b1123e603"} Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.391053 4826 generic.go:334] "Generic (PLEG): container finished" podID="36ba563e-3246-4dbb-ba34-e15fd6646fad" containerID="194aba72010cf27497ca6cea02ad74054529a252896975a16f9a305f2c121f86" exitCode=1 Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.391495 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36ba563e-3246-4dbb-ba34-e15fd6646fad","Type":"ContainerDied","Data":"194aba72010cf27497ca6cea02ad74054529a252896975a16f9a305f2c121f86"} Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.394369 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerStarted","Data":"1153e34a8181347a9516274e4a72505ada421a6d22f001b104c4379151d58f35"} Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.465678 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vt4z5" podStartSLOduration=4.78059834 podStartE2EDuration="1m2.461426895s" podCreationTimestamp="2026-03-19 20:15:56 +0000 UTC" firstStartedPulling="2026-03-19 20:16:00.151544413 +0000 UTC m=+4784.905612726" lastFinishedPulling="2026-03-19 20:16:57.832372968 +0000 UTC m=+4842.586441281" observedRunningTime="2026-03-19 20:16:58.441945343 +0000 UTC m=+4843.196013656" watchObservedRunningTime="2026-03-19 20:16:58.461426895 +0000 
UTC m=+4843.215495208" Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.730032 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" containerID="cri-o://9bc32b51b5566b427ffa287240d9eb0613e8145bd9253dd2736092863a4a7221" gracePeriod=12 Mar 19 20:16:58 crc kubenswrapper[4826]: I0319 20:16:58.749653 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-646cd56bc9-8t2bm" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.108474 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.153385 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wrwzn" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.265464 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.410570 4826 generic.go:334] "Generic (PLEG): container finished" podID="46e578cd-3724-4abe-805c-554b384ed050" containerID="9bc32b51b5566b427ffa287240d9eb0613e8145bd9253dd2736092863a4a7221" exitCode=0 Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.410839 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" event={"ID":"46e578cd-3724-4abe-805c-554b384ed050","Type":"ContainerDied","Data":"9bc32b51b5566b427ffa287240d9eb0613e8145bd9253dd2736092863a4a7221"} Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.483421 4826 patch_prober.go:28] interesting pod/oauth-openshift-55bb4f975f-zpl6z container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.74:6443/healthz\": dial tcp 10.217.0.74:6443: connect: connection refused" start-of-body= Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.483475 4826 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" podUID="46e578cd-3724-4abe-805c-554b384ed050" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.74:6443/healthz\": dial tcp 10.217.0.74:6443: connect: connection refused" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.651390 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-66v8z" Mar 19 20:16:59 crc kubenswrapper[4826]: I0319 20:16:59.883478 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.033151 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwnj7\" (UniqueName: \"kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7\") pod \"773a141c-3fd8-4f47-b8e1-ad98015ea7d4\" (UID: \"773a141c-3fd8-4f47-b8e1-ad98015ea7d4\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.048049 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7" (OuterVolumeSpecName: "kube-api-access-wwnj7") pod "773a141c-3fd8-4f47-b8e1-ad98015ea7d4" (UID: "773a141c-3fd8-4f47-b8e1-ad98015ea7d4"). InnerVolumeSpecName "kube-api-access-wwnj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.135895 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwnj7\" (UniqueName: \"kubernetes.io/projected/773a141c-3fd8-4f47-b8e1-ad98015ea7d4-kube-api-access-wwnj7\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.168938 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:00 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:00 crc kubenswrapper[4826]: > Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.213156 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339071 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339152 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339198 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: 
\"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339221 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339241 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339272 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlt8n\" (UniqueName: \"kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339419 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.339954 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.340259 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data" (OuterVolumeSpecName: "config-data") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.340317 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.340345 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key\") pod \"36ba563e-3246-4dbb-ba34-e15fd6646fad\" (UID: \"36ba563e-3246-4dbb-ba34-e15fd6646fad\") " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.340908 4826 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.340925 4826 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.345291 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.356231 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n" (OuterVolumeSpecName: "kube-api-access-tlt8n") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "kube-api-access-tlt8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.360786 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.371882 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.378565 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-tcdmb" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.392897 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.410236 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.427062 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "36ba563e-3246-4dbb-ba34-e15fd6646fad" (UID: "36ba563e-3246-4dbb-ba34-e15fd6646fad"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.427627 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" event={"ID":"773a141c-3fd8-4f47-b8e1-ad98015ea7d4","Type":"ContainerDied","Data":"75deefc8d688b23aecc15d14ccb268112b9361a9766883b60fe876272b9131cc"} Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.428586 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75deefc8d688b23aecc15d14ccb268112b9361a9766883b60fe876272b9131cc" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.427711 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565856-nwv2c" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.429577 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" event={"ID":"46e578cd-3724-4abe-805c-554b384ed050","Type":"ContainerStarted","Data":"5776d426c731be8d50da58dda51d74d20024503eccf3fcb021930d1016d9caaa"} Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.430544 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.432510 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36ba563e-3246-4dbb-ba34-e15fd6646fad","Type":"ContainerDied","Data":"4db1569639cd08c33f37c6f66fd4ae88f333f99fed18a87cd7c4b02404abb45c"} Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.432541 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db1569639cd08c33f37c6f66fd4ae88f333f99fed18a87cd7c4b02404abb45c" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.432581 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.443559 4826 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36ba563e-3246-4dbb-ba34-e15fd6646fad-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.443599 4826 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.447424 4826 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.447452 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.447468 4826 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36ba563e-3246-4dbb-ba34-e15fd6646fad-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.447480 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlt8n\" (UniqueName: \"kubernetes.io/projected/36ba563e-3246-4dbb-ba34-e15fd6646fad-kube-api-access-tlt8n\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.447492 4826 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36ba563e-3246-4dbb-ba34-e15fd6646fad-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: 
I0319 20:17:00.488784 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zc8ht" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.503963 4826 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.542288 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-prxxj" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.551855 4826 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 19 20:17:00 crc kubenswrapper[4826]: I0319 20:17:00.672588 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-55bb4f975f-zpl6z" Mar 19 20:17:01 crc kubenswrapper[4826]: I0319 20:17:01.177418 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6btqx" Mar 19 20:17:01 crc kubenswrapper[4826]: I0319 20:17:01.358444 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dnc22" Mar 19 20:17:01 crc kubenswrapper[4826]: I0319 20:17:01.445683 4826 generic.go:334] "Generic (PLEG): container finished" podID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerID="f7c7d7accd377b922c768cd2c63561dd625ead7eed451e5666328299f2ed6f49" exitCode=0 Mar 19 20:17:01 crc kubenswrapper[4826]: I0319 20:17:01.445779 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0ba365-345a-4a2b-b919-b5e9de88b680","Type":"ContainerDied","Data":"f7c7d7accd377b922c768cd2c63561dd625ead7eed451e5666328299f2ed6f49"} Mar 19 20:17:02 crc 
kubenswrapper[4826]: I0319 20:17:02.088525 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 20:17:02 crc kubenswrapper[4826]: I0319 20:17:02.095162 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 20:17:02 crc kubenswrapper[4826]: I0319 20:17:02.465110 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 20:17:04 crc kubenswrapper[4826]: I0319 20:17:04.025824 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-d88f59dd5-fqs6s" Mar 19 20:17:04 crc kubenswrapper[4826]: I0319 20:17:04.500761 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0ba365-345a-4a2b-b919-b5e9de88b680","Type":"ContainerStarted","Data":"2f7b2a6f9baba8a2f1aea980204f4dc7f00ca5d9e5d2fa8f4a88128e16725d0f"} Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.648460 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:17:07 crc kubenswrapper[4826]: E0319 20:17:07.650498 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ba563e-3246-4dbb-ba34-e15fd6646fad" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.652342 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ba563e-3246-4dbb-ba34-e15fd6646fad" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:17:07 crc kubenswrapper[4826]: E0319 20:17:07.652379 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773a141c-3fd8-4f47-b8e1-ad98015ea7d4" containerName="oc" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.652386 4826 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="773a141c-3fd8-4f47-b8e1-ad98015ea7d4" containerName="oc" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.652684 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="773a141c-3fd8-4f47-b8e1-ad98015ea7d4" containerName="oc" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.652710 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ba563e-3246-4dbb-ba34-e15fd6646fad" containerName="tempest-tests-tempest-tests-runner" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.654874 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.658675 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-vxjvc" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.678667 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.754910 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.754970 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm4bs\" (UniqueName: \"kubernetes.io/projected/f524c648-b4b0-4f10-abf3-5a3ccc1d96a0-kube-api-access-hm4bs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 
20:17:07.769493 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.769686 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.856677 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm4bs\" (UniqueName: \"kubernetes.io/projected/f524c648-b4b0-4f10-abf3-5a3ccc1d96a0-kube-api-access-hm4bs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.857295 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.857835 4826 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.894342 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm4bs\" (UniqueName: \"kubernetes.io/projected/f524c648-b4b0-4f10-abf3-5a3ccc1d96a0-kube-api-access-hm4bs\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.930691 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:07 crc kubenswrapper[4826]: I0319 20:17:07.977019 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 20:17:08 crc kubenswrapper[4826]: I0319 20:17:08.390444 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 20:17:08 crc kubenswrapper[4826]: I0319 20:17:08.421366 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="9c0ba365-345a-4a2b-b919-b5e9de88b680" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 20:17:08 crc kubenswrapper[4826]: I0319 20:17:08.686931 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 20:17:08 crc kubenswrapper[4826]: I0319 20:17:08.845750 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:08 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:08 crc kubenswrapper[4826]: > Mar 19 20:17:09 crc kubenswrapper[4826]: I0319 20:17:09.581068 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0","Type":"ContainerStarted","Data":"a5260aad0323d57ae0da15ce647f4c873e329df773caab15843b5ca84ca9372b"} Mar 19 20:17:10 crc kubenswrapper[4826]: I0319 20:17:10.171780 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p9rmq" podUID="3a20f6a8-01f3-4492-856d-e5f494672fa3" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:10 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:10 crc kubenswrapper[4826]: > Mar 19 20:17:10 crc kubenswrapper[4826]: I0319 20:17:10.591615 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f524c648-b4b0-4f10-abf3-5a3ccc1d96a0","Type":"ContainerStarted","Data":"ea159c564007e16d9a9e31fb7630a72820c903168c11d14a0c4b7c59e8644ec8"} Mar 19 20:17:10 crc kubenswrapper[4826]: I0319 20:17:10.625750 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.985633711 podStartE2EDuration="3.625643988s" podCreationTimestamp="2026-03-19 20:17:07 +0000 UTC" firstStartedPulling="2026-03-19 20:17:08.694841796 +0000 UTC m=+4853.448910109" lastFinishedPulling="2026-03-19 20:17:10.334852073 +0000 UTC m=+4855.088920386" observedRunningTime="2026-03-19 20:17:10.613000791 +0000 UTC m=+4855.367069104" watchObservedRunningTime="2026-03-19 20:17:10.625643988 +0000 UTC m=+4855.379712301" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.277624 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.420288 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.425603 4826 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.588183 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.597815 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.692185 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:17:12 crc kubenswrapper[4826]: I0319 20:17:12.743211 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 20:17:13 crc kubenswrapper[4826]: I0319 20:17:13.519497 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 20:17:15 crc kubenswrapper[4826]: I0319 20:17:15.087562 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-df5nq"] Mar 19 20:17:15 crc kubenswrapper[4826]: I0319 20:17:15.097799 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565850-df5nq"] Mar 19 20:17:15 crc kubenswrapper[4826]: I0319 20:17:15.994436 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee52295e-4450-4ddd-8da0-07a48a53694f" path="/var/lib/kubelet/pods/ee52295e-4450-4ddd-8da0-07a48a53694f/volumes" Mar 19 20:17:18 crc kubenswrapper[4826]: I0319 20:17:18.834306 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:18 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:18 crc kubenswrapper[4826]: > Mar 
19 20:17:19 crc kubenswrapper[4826]: I0319 20:17:19.158190 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:17:19 crc kubenswrapper[4826]: I0319 20:17:19.218256 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p9rmq" Mar 19 20:17:19 crc kubenswrapper[4826]: I0319 20:17:19.383074 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84d47777df-4x998" Mar 19 20:17:28 crc kubenswrapper[4826]: I0319 20:17:28.833257 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:28 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:28 crc kubenswrapper[4826]: > Mar 19 20:17:32 crc kubenswrapper[4826]: I0319 20:17:32.446852 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 20:17:38 crc kubenswrapper[4826]: I0319 20:17:38.819176 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:38 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:38 crc kubenswrapper[4826]: > Mar 19 20:17:40 crc kubenswrapper[4826]: I0319 20:17:40.594293 4826 scope.go:117] "RemoveContainer" containerID="17e4391027936fd369c038fedd40d620a05c047ebf7d0009fcb95507e3cc054a" Mar 19 20:17:48 crc kubenswrapper[4826]: I0319 20:17:48.843215 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" 
podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:48 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:48 crc kubenswrapper[4826]: > Mar 19 20:17:58 crc kubenswrapper[4826]: I0319 20:17:58.834059 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" probeResult="failure" output=< Mar 19 20:17:58 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:17:58 crc kubenswrapper[4826]: > Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.225825 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565858-pj7w6"] Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.241040 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.244440 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-pj7w6"] Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.273922 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.273949 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.273909 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.315833 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pw9\" (UniqueName: 
\"kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9\") pod \"auto-csr-approver-29565858-pj7w6\" (UID: \"ec82184c-99e1-400c-b68e-a504cda95f2e\") " pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.418965 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pw9\" (UniqueName: \"kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9\") pod \"auto-csr-approver-29565858-pj7w6\" (UID: \"ec82184c-99e1-400c-b68e-a504cda95f2e\") " pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.468542 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pw9\" (UniqueName: \"kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9\") pod \"auto-csr-approver-29565858-pj7w6\" (UID: \"ec82184c-99e1-400c-b68e-a504cda95f2e\") " pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:00 crc kubenswrapper[4826]: I0319 20:18:00.617275 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:02 crc kubenswrapper[4826]: I0319 20:18:02.093256 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-pj7w6"] Mar 19 20:18:02 crc kubenswrapper[4826]: W0319 20:18:02.106719 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec82184c_99e1_400c_b68e_a504cda95f2e.slice/crio-b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2 WatchSource:0}: Error finding container b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2: Status 404 returned error can't find the container with id b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2 Mar 19 20:18:02 crc kubenswrapper[4826]: I0319 20:18:02.711153 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" event={"ID":"ec82184c-99e1-400c-b68e-a504cda95f2e","Type":"ContainerStarted","Data":"b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2"} Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.039912 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-splz6/must-gather-c99v7"] Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.042236 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.045429 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-splz6"/"kube-root-ca.crt" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.045642 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-splz6"/"openshift-service-ca.crt" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.060193 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-splz6"/"default-dockercfg-c4dw7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.071934 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.072065 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6wt\" (UniqueName: \"kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.080064 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-splz6/must-gather-c99v7"] Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.174532 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6wt\" (UniqueName: \"kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " 
pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.174789 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.176048 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.242118 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6wt\" (UniqueName: \"kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt\") pod \"must-gather-c99v7\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:04 crc kubenswrapper[4826]: I0319 20:18:04.376872 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:18:05 crc kubenswrapper[4826]: I0319 20:18:05.077186 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-splz6/must-gather-c99v7"] Mar 19 20:18:05 crc kubenswrapper[4826]: I0319 20:18:05.781453 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" event={"ID":"ec82184c-99e1-400c-b68e-a504cda95f2e","Type":"ContainerStarted","Data":"b244a43eade554532371b14bdc314cbae1451b53ab4c37fa35d68a7bc2fa3960"} Mar 19 20:18:05 crc kubenswrapper[4826]: I0319 20:18:05.788609 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/must-gather-c99v7" event={"ID":"9e171a7e-b733-47f9-807a-51f2828b6927","Type":"ContainerStarted","Data":"dac6aa323a188440f99b7a91037aebd6fa8e8b1481b9143a914e899191fb3906"} Mar 19 20:18:05 crc kubenswrapper[4826]: I0319 20:18:05.817374 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" podStartSLOduration=4.795908982 podStartE2EDuration="5.815749456s" podCreationTimestamp="2026-03-19 20:18:00 +0000 UTC" firstStartedPulling="2026-03-19 20:18:02.123892357 +0000 UTC m=+4906.877960670" lastFinishedPulling="2026-03-19 20:18:03.143732831 +0000 UTC m=+4907.897801144" observedRunningTime="2026-03-19 20:18:05.803413917 +0000 UTC m=+4910.557482260" watchObservedRunningTime="2026-03-19 20:18:05.815749456 +0000 UTC m=+4910.569817789" Mar 19 20:18:06 crc kubenswrapper[4826]: I0319 20:18:06.813271 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" event={"ID":"ec82184c-99e1-400c-b68e-a504cda95f2e","Type":"ContainerDied","Data":"b244a43eade554532371b14bdc314cbae1451b53ab4c37fa35d68a7bc2fa3960"} Mar 19 20:18:06 crc kubenswrapper[4826]: I0319 20:18:06.814769 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="ec82184c-99e1-400c-b68e-a504cda95f2e" containerID="b244a43eade554532371b14bdc314cbae1451b53ab4c37fa35d68a7bc2fa3960" exitCode=0 Mar 19 20:18:07 crc kubenswrapper[4826]: I0319 20:18:07.842056 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:18:07 crc kubenswrapper[4826]: I0319 20:18:07.933743 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.088397 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.366851 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.506173 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8pw9\" (UniqueName: \"kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9\") pod \"ec82184c-99e1-400c-b68e-a504cda95f2e\" (UID: \"ec82184c-99e1-400c-b68e-a504cda95f2e\") " Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.524505 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9" (OuterVolumeSpecName: "kube-api-access-l8pw9") pod "ec82184c-99e1-400c-b68e-a504cda95f2e" (UID: "ec82184c-99e1-400c-b68e-a504cda95f2e"). InnerVolumeSpecName "kube-api-access-l8pw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.610093 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8pw9\" (UniqueName: \"kubernetes.io/projected/ec82184c-99e1-400c-b68e-a504cda95f2e-kube-api-access-l8pw9\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.848820 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" event={"ID":"ec82184c-99e1-400c-b68e-a504cda95f2e","Type":"ContainerDied","Data":"b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2"} Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.848952 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565858-pj7w6" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.849696 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b210a5cf97adcd132e19245bb536f62e4d1005c520152663c1cdecac7e4f59e2" Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.880163 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-knh8r"] Mar 19 20:18:08 crc kubenswrapper[4826]: I0319 20:18:08.893671 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565852-knh8r"] Mar 19 20:18:09 crc kubenswrapper[4826]: I0319 20:18:09.859476 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vt4z5" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" containerID="cri-o://1153e34a8181347a9516274e4a72505ada421a6d22f001b104c4379151d58f35" gracePeriod=2 Mar 19 20:18:10 crc kubenswrapper[4826]: I0319 20:18:10.004292 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7" 
path="/var/lib/kubelet/pods/d6bc3f75-71a9-48cf-bc96-8ab9b191a3f7/volumes" Mar 19 20:18:10 crc kubenswrapper[4826]: I0319 20:18:10.884548 4826 generic.go:334] "Generic (PLEG): container finished" podID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerID="1153e34a8181347a9516274e4a72505ada421a6d22f001b104c4379151d58f35" exitCode=0 Mar 19 20:18:10 crc kubenswrapper[4826]: I0319 20:18:10.884695 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerDied","Data":"1153e34a8181347a9516274e4a72505ada421a6d22f001b104c4379151d58f35"} Mar 19 20:18:15 crc kubenswrapper[4826]: I0319 20:18:15.927561 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.057516 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content\") pod \"4c98a23b-914e-4545-92fd-056eeb4af1ee\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.058108 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddzmc\" (UniqueName: \"kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc\") pod \"4c98a23b-914e-4545-92fd-056eeb4af1ee\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.058256 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities\") pod \"4c98a23b-914e-4545-92fd-056eeb4af1ee\" (UID: \"4c98a23b-914e-4545-92fd-056eeb4af1ee\") " Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.059928 4826 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities" (OuterVolumeSpecName: "utilities") pod "4c98a23b-914e-4545-92fd-056eeb4af1ee" (UID: "4c98a23b-914e-4545-92fd-056eeb4af1ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.086057 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc" (OuterVolumeSpecName: "kube-api-access-ddzmc") pod "4c98a23b-914e-4545-92fd-056eeb4af1ee" (UID: "4c98a23b-914e-4545-92fd-056eeb4af1ee"). InnerVolumeSpecName "kube-api-access-ddzmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.167688 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vt4z5" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.171524 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c98a23b-914e-4545-92fd-056eeb4af1ee" (UID: "4c98a23b-914e-4545-92fd-056eeb4af1ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.175094 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vt4z5" event={"ID":"4c98a23b-914e-4545-92fd-056eeb4af1ee","Type":"ContainerDied","Data":"a47ea601a88e3f920e02b6f94fc8d996719e98bd0696cedcf56e122064374f3e"} Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.175149 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/must-gather-c99v7" event={"ID":"9e171a7e-b733-47f9-807a-51f2828b6927","Type":"ContainerStarted","Data":"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118"} Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.175172 4826 scope.go:117] "RemoveContainer" containerID="1153e34a8181347a9516274e4a72505ada421a6d22f001b104c4379151d58f35" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.180415 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddzmc\" (UniqueName: \"kubernetes.io/projected/4c98a23b-914e-4545-92fd-056eeb4af1ee-kube-api-access-ddzmc\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.181440 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.183758 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c98a23b-914e-4545-92fd-056eeb4af1ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.230296 4826 scope.go:117] "RemoveContainer" containerID="3afda7e361b537c4d54b166e46699b3580326e89b48ef9f26da22c6b1123e603" Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.235439 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.245861 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vt4z5"] Mar 19 20:18:16 crc kubenswrapper[4826]: I0319 20:18:16.262620 4826 scope.go:117] "RemoveContainer" containerID="da45a447d2129cee85036f1261d0bf2cdf4e5d868b2a5d5583cb4492b3ea8744" Mar 19 20:18:17 crc kubenswrapper[4826]: I0319 20:18:17.050477 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/must-gather-c99v7" event={"ID":"9e171a7e-b733-47f9-807a-51f2828b6927","Type":"ContainerStarted","Data":"4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a"} Mar 19 20:18:17 crc kubenswrapper[4826]: I0319 20:18:17.080551 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-splz6/must-gather-c99v7" podStartSLOduration=4.330174332 podStartE2EDuration="14.080520822s" podCreationTimestamp="2026-03-19 20:18:03 +0000 UTC" firstStartedPulling="2026-03-19 20:18:05.082110835 +0000 UTC m=+4909.836179148" lastFinishedPulling="2026-03-19 20:18:14.832457325 +0000 UTC m=+4919.586525638" observedRunningTime="2026-03-19 20:18:17.070480019 +0000 UTC m=+4921.824548372" watchObservedRunningTime="2026-03-19 20:18:17.080520822 +0000 UTC m=+4921.834589175" Mar 19 20:18:17 crc kubenswrapper[4826]: I0319 20:18:17.994258 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" path="/var/lib/kubelet/pods/4c98a23b-914e-4545-92fd-056eeb4af1ee/volumes" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.007390 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-splz6/crc-debug-glbf7"] Mar 19 20:18:22 crc kubenswrapper[4826]: E0319 20:18:22.008670 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" Mar 19 20:18:22 crc 
kubenswrapper[4826]: I0319 20:18:22.009110 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" Mar 19 20:18:22 crc kubenswrapper[4826]: E0319 20:18:22.009170 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="extract-utilities" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.009180 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="extract-utilities" Mar 19 20:18:22 crc kubenswrapper[4826]: E0319 20:18:22.009197 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="extract-content" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.009204 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="extract-content" Mar 19 20:18:22 crc kubenswrapper[4826]: E0319 20:18:22.009216 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec82184c-99e1-400c-b68e-a504cda95f2e" containerName="oc" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.009223 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec82184c-99e1-400c-b68e-a504cda95f2e" containerName="oc" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.009485 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec82184c-99e1-400c-b68e-a504cda95f2e" containerName="oc" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.009518 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c98a23b-914e-4545-92fd-056eeb4af1ee" containerName="registry-server" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.011090 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.049244 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcb8\" (UniqueName: \"kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.049759 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.151796 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjcb8\" (UniqueName: \"kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.152080 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.160738 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc 
kubenswrapper[4826]: I0319 20:18:22.174310 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjcb8\" (UniqueName: \"kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8\") pod \"crc-debug-glbf7\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:22 crc kubenswrapper[4826]: I0319 20:18:22.334986 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:18:23 crc kubenswrapper[4826]: I0319 20:18:23.138275 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-glbf7" event={"ID":"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2","Type":"ContainerStarted","Data":"c74c1919f0cc26af1ee4a50b0401f94a91c37d29e575927e2bd611aecc0048f7"} Mar 19 20:18:35 crc kubenswrapper[4826]: I0319 20:18:35.305202 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-glbf7" event={"ID":"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2","Type":"ContainerStarted","Data":"ac3f90f5690c61ca239256ac11f844b35a0251e1f2d64107902be402c4c3a9f9"} Mar 19 20:18:35 crc kubenswrapper[4826]: I0319 20:18:35.328491 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-splz6/crc-debug-glbf7" podStartSLOduration=2.436769524 podStartE2EDuration="14.328470594s" podCreationTimestamp="2026-03-19 20:18:21 +0000 UTC" firstStartedPulling="2026-03-19 20:18:22.426280536 +0000 UTC m=+4927.180348859" lastFinishedPulling="2026-03-19 20:18:34.317981616 +0000 UTC m=+4939.072049929" observedRunningTime="2026-03-19 20:18:35.32338599 +0000 UTC m=+4940.077454313" watchObservedRunningTime="2026-03-19 20:18:35.328470594 +0000 UTC m=+4940.082538907" Mar 19 20:18:40 crc kubenswrapper[4826]: I0319 20:18:40.879862 4826 scope.go:117] "RemoveContainer" 
containerID="d0993fecaa7e2ef7f6f11e5a016a4032e34b14c64a539a8ca2f4d6a9e3ff9adf" Mar 19 20:18:46 crc kubenswrapper[4826]: I0319 20:18:46.206416 4826 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod4c98a23b-914e-4545-92fd-056eeb4af1ee"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod4c98a23b-914e-4545-92fd-056eeb4af1ee] : Timed out while waiting for systemd to remove kubepods-burstable-pod4c98a23b_914e_4545_92fd_056eeb4af1ee.slice" Mar 19 20:18:55 crc kubenswrapper[4826]: I0319 20:18:55.400604 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:18:55 crc kubenswrapper[4826]: I0319 20:18:55.402337 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:19:03 crc kubenswrapper[4826]: I0319 20:19:03.652889 4826 generic.go:334] "Generic (PLEG): container finished" podID="ab7c3046-ac34-417e-a7c6-63e500286063" containerID="1d31d0d8db0c2bb8340ffd0cc50bd990421104e190bb05e42ac92b09f1760326" exitCode=0 Mar 19 20:19:03 crc kubenswrapper[4826]: I0319 20:19:03.653369 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" event={"ID":"ab7c3046-ac34-417e-a7c6-63e500286063","Type":"ContainerDied","Data":"1d31d0d8db0c2bb8340ffd0cc50bd990421104e190bb05e42ac92b09f1760326"} Mar 19 20:19:03 crc kubenswrapper[4826]: I0319 20:19:03.653490 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" event={"ID":"ab7c3046-ac34-417e-a7c6-63e500286063","Type":"ContainerStarted","Data":"8d6d1d2bfdf20dc170c4472b04dffc6e2c44332a84c5354f14aa5ccfdd7e332f"} Mar 19 20:19:11 crc kubenswrapper[4826]: I0319 20:19:11.660640 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 20:19:11 crc kubenswrapper[4826]: I0319 20:19:11.662471 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 20:19:18 crc kubenswrapper[4826]: I0319 20:19:18.856095 4826 generic.go:334] "Generic (PLEG): container finished" podID="64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" containerID="ac3f90f5690c61ca239256ac11f844b35a0251e1f2d64107902be402c4c3a9f9" exitCode=0 Mar 19 20:19:18 crc kubenswrapper[4826]: I0319 20:19:18.856233 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-glbf7" event={"ID":"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2","Type":"ContainerDied","Data":"ac3f90f5690c61ca239256ac11f844b35a0251e1f2d64107902be402c4c3a9f9"} Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.014903 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.053192 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjcb8\" (UniqueName: \"kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8\") pod \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.053259 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host\") pod \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\" (UID: \"64f36ec1-6965-47bc-9dc7-1bf25fbd19c2\") " Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.053668 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host" (OuterVolumeSpecName: "host") pod "64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" (UID: "64f36ec1-6965-47bc-9dc7-1bf25fbd19c2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.054681 4826 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.061439 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8" (OuterVolumeSpecName: "kube-api-access-bjcb8") pod "64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" (UID: "64f36ec1-6965-47bc-9dc7-1bf25fbd19c2"). InnerVolumeSpecName "kube-api-access-bjcb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.063207 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-splz6/crc-debug-glbf7"] Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.075920 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-splz6/crc-debug-glbf7"] Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.156721 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjcb8\" (UniqueName: \"kubernetes.io/projected/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2-kube-api-access-bjcb8\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.882880 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74c1919f0cc26af1ee4a50b0401f94a91c37d29e575927e2bd611aecc0048f7" Mar 19 20:19:20 crc kubenswrapper[4826]: I0319 20:19:20.882948 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-glbf7" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.302103 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-splz6/crc-debug-v7x2g"] Mar 19 20:19:21 crc kubenswrapper[4826]: E0319 20:19:21.302850 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" containerName="container-00" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.302863 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" containerName="container-00" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.303130 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" containerName="container-00" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.303962 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.382516 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.382681 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnx8\" (UniqueName: \"kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.485220 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psnx8\" (UniqueName: \"kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.485426 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.485529 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc 
kubenswrapper[4826]: I0319 20:19:21.504508 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnx8\" (UniqueName: \"kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8\") pod \"crc-debug-v7x2g\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.625621 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.895171 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-v7x2g" event={"ID":"a65419b5-28d4-4188-81bf-bebf75d3c525","Type":"ContainerStarted","Data":"9d8d75c20db057276955e3becd94bac37cfeba1c68b3c523280b3e633fb53b6b"} Mar 19 20:19:21 crc kubenswrapper[4826]: I0319 20:19:21.988912 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f36ec1-6965-47bc-9dc7-1bf25fbd19c2" path="/var/lib/kubelet/pods/64f36ec1-6965-47bc-9dc7-1bf25fbd19c2/volumes" Mar 19 20:19:22 crc kubenswrapper[4826]: I0319 20:19:22.908569 4826 generic.go:334] "Generic (PLEG): container finished" podID="a65419b5-28d4-4188-81bf-bebf75d3c525" containerID="149aa9af905edd8da0d991f72d546c6e3e17572ae31db109c0dc924ea6ad308e" exitCode=0 Mar 19 20:19:22 crc kubenswrapper[4826]: I0319 20:19:22.908642 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-v7x2g" event={"ID":"a65419b5-28d4-4188-81bf-bebf75d3c525","Type":"ContainerDied","Data":"149aa9af905edd8da0d991f72d546c6e3e17572ae31db109c0dc924ea6ad308e"} Mar 19 20:19:23 crc kubenswrapper[4826]: I0319 20:19:23.977707 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-splz6/crc-debug-v7x2g"] Mar 19 20:19:23 crc kubenswrapper[4826]: I0319 20:19:23.997002 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-splz6/crc-debug-v7x2g"] Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.055914 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.249153 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psnx8\" (UniqueName: \"kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8\") pod \"a65419b5-28d4-4188-81bf-bebf75d3c525\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.249214 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host\") pod \"a65419b5-28d4-4188-81bf-bebf75d3c525\" (UID: \"a65419b5-28d4-4188-81bf-bebf75d3c525\") " Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.249320 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host" (OuterVolumeSpecName: "host") pod "a65419b5-28d4-4188-81bf-bebf75d3c525" (UID: "a65419b5-28d4-4188-81bf-bebf75d3c525"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.249883 4826 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a65419b5-28d4-4188-81bf-bebf75d3c525-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.254846 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8" (OuterVolumeSpecName: "kube-api-access-psnx8") pod "a65419b5-28d4-4188-81bf-bebf75d3c525" (UID: "a65419b5-28d4-4188-81bf-bebf75d3c525"). 
InnerVolumeSpecName "kube-api-access-psnx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.352157 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psnx8\" (UniqueName: \"kubernetes.io/projected/a65419b5-28d4-4188-81bf-bebf75d3c525-kube-api-access-psnx8\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.932234 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d8d75c20db057276955e3becd94bac37cfeba1c68b3c523280b3e633fb53b6b" Mar 19 20:19:24 crc kubenswrapper[4826]: I0319 20:19:24.932537 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-v7x2g" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.241098 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-splz6/crc-debug-p7h48"] Mar 19 20:19:25 crc kubenswrapper[4826]: E0319 20:19:25.241758 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65419b5-28d4-4188-81bf-bebf75d3c525" containerName="container-00" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.241781 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65419b5-28d4-4188-81bf-bebf75d3c525" containerName="container-00" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.242087 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65419b5-28d4-4188-81bf-bebf75d3c525" containerName="container-00" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.243087 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.276488 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.276646 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5hq\" (UniqueName: \"kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.379637 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.379746 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.379895 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5hq\" (UniqueName: \"kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc 
kubenswrapper[4826]: I0319 20:19:25.396761 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5hq\" (UniqueName: \"kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq\") pod \"crc-debug-p7h48\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.401011 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.401180 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.562240 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.945478 4826 generic.go:334] "Generic (PLEG): container finished" podID="d018daf0-50fc-4707-a6fa-620ddae101cf" containerID="d9504428b78d71865795f5ad9d00ea0aacddb666be3561f3ca4c4973148fc659" exitCode=0 Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.945509 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-p7h48" event={"ID":"d018daf0-50fc-4707-a6fa-620ddae101cf","Type":"ContainerDied","Data":"d9504428b78d71865795f5ad9d00ea0aacddb666be3561f3ca4c4973148fc659"} Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.945813 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/crc-debug-p7h48" event={"ID":"d018daf0-50fc-4707-a6fa-620ddae101cf","Type":"ContainerStarted","Data":"8846b67c72bcdc66e529a14c743b43eb043381362f469369db2f6e48e047ab4e"} Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.996468 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65419b5-28d4-4188-81bf-bebf75d3c525" path="/var/lib/kubelet/pods/a65419b5-28d4-4188-81bf-bebf75d3c525/volumes" Mar 19 20:19:25 crc kubenswrapper[4826]: I0319 20:19:25.997099 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-splz6/crc-debug-p7h48"] Mar 19 20:19:26 crc kubenswrapper[4826]: I0319 20:19:26.005428 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-splz6/crc-debug-p7h48"] Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.137731 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.329469 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host\") pod \"d018daf0-50fc-4707-a6fa-620ddae101cf\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.330126 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5hq\" (UniqueName: \"kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq\") pod \"d018daf0-50fc-4707-a6fa-620ddae101cf\" (UID: \"d018daf0-50fc-4707-a6fa-620ddae101cf\") " Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.332640 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host" (OuterVolumeSpecName: "host") pod "d018daf0-50fc-4707-a6fa-620ddae101cf" (UID: "d018daf0-50fc-4707-a6fa-620ddae101cf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.340277 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq" (OuterVolumeSpecName: "kube-api-access-6p5hq") pod "d018daf0-50fc-4707-a6fa-620ddae101cf" (UID: "d018daf0-50fc-4707-a6fa-620ddae101cf"). InnerVolumeSpecName "kube-api-access-6p5hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.433502 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5hq\" (UniqueName: \"kubernetes.io/projected/d018daf0-50fc-4707-a6fa-620ddae101cf-kube-api-access-6p5hq\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.433872 4826 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d018daf0-50fc-4707-a6fa-620ddae101cf-host\") on node \"crc\" DevicePath \"\"" Mar 19 20:19:27 crc kubenswrapper[4826]: I0319 20:19:27.976544 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/crc-debug-p7h48" Mar 19 20:19:28 crc kubenswrapper[4826]: I0319 20:19:28.000972 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d018daf0-50fc-4707-a6fa-620ddae101cf" path="/var/lib/kubelet/pods/d018daf0-50fc-4707-a6fa-620ddae101cf/volumes" Mar 19 20:19:28 crc kubenswrapper[4826]: I0319 20:19:28.002356 4826 scope.go:117] "RemoveContainer" containerID="d9504428b78d71865795f5ad9d00ea0aacddb666be3561f3ca4c4973148fc659" Mar 19 20:19:31 crc kubenswrapper[4826]: I0319 20:19:31.671814 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 20:19:31 crc kubenswrapper[4826]: I0319 20:19:31.677414 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-657c5b447-gjh5h" Mar 19 20:19:55 crc kubenswrapper[4826]: I0319 20:19:55.400250 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:19:55 crc kubenswrapper[4826]: 
I0319 20:19:55.400834 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:19:55 crc kubenswrapper[4826]: I0319 20:19:55.400882 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 20:19:55 crc kubenswrapper[4826]: I0319 20:19:55.401824 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:19:55 crc kubenswrapper[4826]: I0319 20:19:55.401888 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" gracePeriod=600 Mar 19 20:19:55 crc kubenswrapper[4826]: E0319 20:19:55.527854 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:19:56 crc kubenswrapper[4826]: I0319 20:19:56.372461 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" exitCode=0 Mar 19 20:19:56 crc kubenswrapper[4826]: I0319 20:19:56.372523 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823"} Mar 19 20:19:56 crc kubenswrapper[4826]: I0319 20:19:56.372640 4826 scope.go:117] "RemoveContainer" containerID="b791712166d20c7da73a215deab7b11094ad95f1cb3bea1ca9cc7f96f4e37482" Mar 19 20:19:56 crc kubenswrapper[4826]: I0319 20:19:56.373775 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:19:56 crc kubenswrapper[4826]: E0319 20:19:56.374372 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.154463 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565860-7lvs2"] Mar 19 20:20:00 crc kubenswrapper[4826]: E0319 20:20:00.155979 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d018daf0-50fc-4707-a6fa-620ddae101cf" containerName="container-00" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.156000 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="d018daf0-50fc-4707-a6fa-620ddae101cf" containerName="container-00" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.156313 4826 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d018daf0-50fc-4707-a6fa-620ddae101cf" containerName="container-00" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.157445 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.160454 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.160472 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.160456 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.171058 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-7lvs2"] Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.283851 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2896\" (UniqueName: \"kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896\") pod \"auto-csr-approver-29565860-7lvs2\" (UID: \"0a654ca5-dc45-4a08-9c35-e25fb06760c6\") " pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.387475 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2896\" (UniqueName: \"kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896\") pod \"auto-csr-approver-29565860-7lvs2\" (UID: \"0a654ca5-dc45-4a08-9c35-e25fb06760c6\") " pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.412641 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2896\" (UniqueName: \"kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896\") pod \"auto-csr-approver-29565860-7lvs2\" (UID: \"0a654ca5-dc45-4a08-9c35-e25fb06760c6\") " pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:00 crc kubenswrapper[4826]: I0319 20:20:00.482855 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:01 crc kubenswrapper[4826]: I0319 20:20:01.265756 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-7lvs2"] Mar 19 20:20:01 crc kubenswrapper[4826]: W0319 20:20:01.282075 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a654ca5_dc45_4a08_9c35_e25fb06760c6.slice/crio-6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec WatchSource:0}: Error finding container 6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec: Status 404 returned error can't find the container with id 6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec Mar 19 20:20:01 crc kubenswrapper[4826]: I0319 20:20:01.467065 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" event={"ID":"0a654ca5-dc45-4a08-9c35-e25fb06760c6","Type":"ContainerStarted","Data":"6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec"} Mar 19 20:20:03 crc kubenswrapper[4826]: I0319 20:20:03.497378 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" event={"ID":"0a654ca5-dc45-4a08-9c35-e25fb06760c6","Type":"ContainerStarted","Data":"4b8f142f166faec74c40a5246a9ba266558ced09680d95a3f74613d8760a6fb6"} Mar 19 20:20:03 crc kubenswrapper[4826]: I0319 20:20:03.513251 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29565860-7lvs2" podStartSLOduration=2.276342529 podStartE2EDuration="3.513225175s" podCreationTimestamp="2026-03-19 20:20:00 +0000 UTC" firstStartedPulling="2026-03-19 20:20:01.286465526 +0000 UTC m=+5026.040533829" lastFinishedPulling="2026-03-19 20:20:02.523348132 +0000 UTC m=+5027.277416475" observedRunningTime="2026-03-19 20:20:03.511246508 +0000 UTC m=+5028.265314821" watchObservedRunningTime="2026-03-19 20:20:03.513225175 +0000 UTC m=+5028.267293518" Mar 19 20:20:04 crc kubenswrapper[4826]: I0319 20:20:04.510442 4826 generic.go:334] "Generic (PLEG): container finished" podID="0a654ca5-dc45-4a08-9c35-e25fb06760c6" containerID="4b8f142f166faec74c40a5246a9ba266558ced09680d95a3f74613d8760a6fb6" exitCode=0 Mar 19 20:20:04 crc kubenswrapper[4826]: I0319 20:20:04.510809 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" event={"ID":"0a654ca5-dc45-4a08-9c35-e25fb06760c6","Type":"ContainerDied","Data":"4b8f142f166faec74c40a5246a9ba266558ced09680d95a3f74613d8760a6fb6"} Mar 19 20:20:05 crc kubenswrapper[4826]: I0319 20:20:05.970092 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.140631 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2896\" (UniqueName: \"kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896\") pod \"0a654ca5-dc45-4a08-9c35-e25fb06760c6\" (UID: \"0a654ca5-dc45-4a08-9c35-e25fb06760c6\") " Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.146772 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896" (OuterVolumeSpecName: "kube-api-access-s2896") pod "0a654ca5-dc45-4a08-9c35-e25fb06760c6" (UID: "0a654ca5-dc45-4a08-9c35-e25fb06760c6"). InnerVolumeSpecName "kube-api-access-s2896". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.244172 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2896\" (UniqueName: \"kubernetes.io/projected/0a654ca5-dc45-4a08-9c35-e25fb06760c6-kube-api-access-s2896\") on node \"crc\" DevicePath \"\"" Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.534906 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" event={"ID":"0a654ca5-dc45-4a08-9c35-e25fb06760c6","Type":"ContainerDied","Data":"6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec"} Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.534952 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df32da016c1366b6fe0a87a24628c567c7a7497b792c525d2d6f515c61303ec" Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.535008 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565860-7lvs2" Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.658791 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-m7sgq"] Mar 19 20:20:06 crc kubenswrapper[4826]: I0319 20:20:06.671445 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565854-m7sgq"] Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.524428 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e9249025-1986-4855-8225-8ca0601709ce/aodh-api/0.log" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.747248 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e9249025-1986-4855-8225-8ca0601709ce/aodh-evaluator/0.log" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.763872 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e9249025-1986-4855-8225-8ca0601709ce/aodh-notifier/0.log" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.767018 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_e9249025-1986-4855-8225-8ca0601709ce/aodh-listener/0.log" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.937241 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-554d47978d-xzcgd_a4ae16c3-6030-48fc-b4d2-057a45770fe1/barbican-api/0.log" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.977223 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:20:07 crc kubenswrapper[4826]: E0319 20:20:07.977577 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:20:07 crc kubenswrapper[4826]: I0319 20:20:07.990533 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d4e4ee-f487-404d-9bdd-a945c36d7cbc" path="/var/lib/kubelet/pods/89d4e4ee-f487-404d-9bdd-a945c36d7cbc/volumes" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.023722 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-554d47978d-xzcgd_a4ae16c3-6030-48fc-b4d2-057a45770fe1/barbican-api-log/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.130406 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f89998948-7dsjr_f794027a-e079-4fdc-94b5-aa448e034df7/barbican-keystone-listener/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.182955 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f89998948-7dsjr_f794027a-e079-4fdc-94b5-aa448e034df7/barbican-keystone-listener-log/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.271213 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-67df6b8d6c-9kl5x_6cf6154b-c9df-415a-82dd-907847aaf7d4/barbican-worker/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.337167 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-67df6b8d6c-9kl5x_6cf6154b-c9df-415a-82dd-907847aaf7d4/barbican-worker-log/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.562811 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab298593-ac97-4031-8bfc-b0e5be9b341a/ceilometer-central-agent/1.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.609921 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-flqqq_a5c0489c-9ec7-4851-b96a-d2cebe602bf2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.799608 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab298593-ac97-4031-8bfc-b0e5be9b341a/ceilometer-central-agent/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.815621 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab298593-ac97-4031-8bfc-b0e5be9b341a/proxy-httpd/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.821368 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab298593-ac97-4031-8bfc-b0e5be9b341a/ceilometer-notification-agent/0.log" Mar 19 20:20:08 crc kubenswrapper[4826]: I0319 20:20:08.831358 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab298593-ac97-4031-8bfc-b0e5be9b341a/sg-core/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.035373 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e18e7f7e-f1f1-4349-a076-79e1f781315d/cinder-api/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.048514 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e18e7f7e-f1f1-4349-a076-79e1f781315d/cinder-api-log/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.229352 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9c0ba365-345a-4a2b-b919-b5e9de88b680/cinder-scheduler/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.280620 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9c0ba365-345a-4a2b-b919-b5e9de88b680/probe/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.281931 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_9c0ba365-345a-4a2b-b919-b5e9de88b680/cinder-scheduler/1.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.483106 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2bzk2_ce8656f9-5811-44d9-bed1-39fb364ddc4f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.604614 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d7rnr_a450a030-0a6c-4b58-8d12-ac4b7fb8c20d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.701287 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-qk9tk_ce7b509e-beae-4a19-803c-339c76a4d51e/init/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.887690 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-qk9tk_ce7b509e-beae-4a19-803c-339c76a4d51e/init/0.log" Mar 19 20:20:09 crc kubenswrapper[4826]: I0319 20:20:09.950729 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-qk9tk_ce7b509e-beae-4a19-803c-339c76a4d51e/dnsmasq-dns/0.log" Mar 19 20:20:10 crc kubenswrapper[4826]: I0319 20:20:10.000138 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5zwm5_e15b97af-3c47-4d33-82d0-20e627959de3/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:10 crc kubenswrapper[4826]: I0319 20:20:10.394885 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2bdeb398-28cb-4d25-89fd-81af1e9ad81e/glance-httpd/0.log" Mar 19 20:20:10 crc kubenswrapper[4826]: I0319 20:20:10.536954 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_2bdeb398-28cb-4d25-89fd-81af1e9ad81e/glance-log/0.log" Mar 19 20:20:10 crc kubenswrapper[4826]: I0319 20:20:10.602549 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3ada0e1e-2ace-4520-9fce-e0d9771afc01/glance-httpd/0.log" Mar 19 20:20:10 crc kubenswrapper[4826]: I0319 20:20:10.663628 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3ada0e1e-2ace-4520-9fce-e0d9771afc01/glance-log/0.log" Mar 19 20:20:11 crc kubenswrapper[4826]: I0319 20:20:11.431569 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-fb594b459-7sf97_77ff195c-0819-4764-b09f-fd10a1aea177/heat-api/0.log" Mar 19 20:20:11 crc kubenswrapper[4826]: I0319 20:20:11.443569 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-84bf69fdcb-b6hq4_a8d87bc1-29fa-4219-8c55-968d58f697e8/heat-engine/0.log" Mar 19 20:20:11 crc kubenswrapper[4826]: I0319 20:20:11.478233 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5b5447b648-5hq9h_45e3dc79-4f5e-4bec-a579-41b93f1d6150/heat-cfnapi/0.log" Mar 19 20:20:11 crc kubenswrapper[4826]: I0319 20:20:11.516939 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-glfmw_e534979a-19b2-434a-9b5e-d7d7f4d9125b/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:11 crc kubenswrapper[4826]: I0319 20:20:11.841327 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7frjg_58cc0c09-7531-463e-ac30-4cb993eca5fc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.034030 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29565841-mg6mz_29dfabdd-c376-4871-9fa7-68f2feef0e4a/keystone-cron/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.088876 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_06606200-618f-46b2-afb9-e5e2738fe2dd/kube-state-metrics/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.362481 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-5j76b_71a93dcb-a020-4cb2-b7bc-66042b67be66/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.552441 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-578cc9f57d-6xb4n_9470cb99-e153-461b-a17e-83897c3de6f1/keystone-api/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.696765 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_a472fa43-9652-40de-9038-082a4314b962/mysqld-exporter/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.788430 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-r4tbj_0e998b4b-67f9-4f5c-bcb7-df21ad523a61/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:12 crc kubenswrapper[4826]: I0319 20:20:12.998367 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6674b95fd5-8s42n_79f755d3-73d9-4207-b77b-af2ac5f99404/neutron-httpd/0.log" Mar 19 20:20:13 crc kubenswrapper[4826]: I0319 20:20:13.030230 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6674b95fd5-8s42n_79f755d3-73d9-4207-b77b-af2ac5f99404/neutron-api/0.log" Mar 19 20:20:13 crc kubenswrapper[4826]: I0319 20:20:13.160961 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-r5gzg_29402a90-3f51-4212-bc44-5c382894d0e6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:13 crc kubenswrapper[4826]: I0319 20:20:13.727813 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fbc03ca-3891-4c9b-8ac9-88efbe950d4d/nova-api-log/0.log" Mar 19 20:20:13 crc kubenswrapper[4826]: I0319 20:20:13.752066 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d98a927a-17ed-4120-b2e9-0480ce473022/nova-cell0-conductor-conductor/0.log" Mar 19 20:20:14 crc kubenswrapper[4826]: I0319 20:20:14.059554 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fb070564-2600-49cb-92d3-f8097addd815/nova-cell1-conductor-conductor/0.log" Mar 19 20:20:14 crc kubenswrapper[4826]: I0319 20:20:14.069440 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_dc847c8f-db3b-41e1-b0bd-43faeeb17707/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 20:20:14 crc kubenswrapper[4826]: I0319 20:20:14.325746 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2fbc03ca-3891-4c9b-8ac9-88efbe950d4d/nova-api-api/0.log" Mar 19 20:20:14 crc kubenswrapper[4826]: I0319 20:20:14.801131 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5660601a-af93-417b-b7c8-b68893743919/nova-metadata-log/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.330647 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c55cf1ac-b2f2-4801-bb17-8e9b703d33eb/nova-scheduler-scheduler/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.358796 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ppww7_d55a2c66-72a6-4026-90be-b4a15bf7914a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.454410 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_38814433-1737-49df-966a-ac3511ed48dd/mysql-bootstrap/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.603080 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5660601a-af93-417b-b7c8-b68893743919/nova-metadata-metadata/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.617251 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_38814433-1737-49df-966a-ac3511ed48dd/mysql-bootstrap/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.659970 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_38814433-1737-49df-966a-ac3511ed48dd/galera/1.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.703852 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_38814433-1737-49df-966a-ac3511ed48dd/galera/0.log" Mar 19 20:20:15 crc kubenswrapper[4826]: I0319 20:20:15.821929 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_763c5ded-be94-49ad-9eea-447e444f24f3/mysql-bootstrap/0.log" Mar 19 20:20:16 crc kubenswrapper[4826]: I0319 20:20:16.068473 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_763c5ded-be94-49ad-9eea-447e444f24f3/mysql-bootstrap/0.log" Mar 19 20:20:16 crc kubenswrapper[4826]: I0319 20:20:16.154387 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_763c5ded-be94-49ad-9eea-447e444f24f3/galera/0.log" Mar 19 20:20:16 crc kubenswrapper[4826]: I0319 20:20:16.179496 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_763c5ded-be94-49ad-9eea-447e444f24f3/galera/1.log" Mar 19 20:20:16 crc kubenswrapper[4826]: I0319 20:20:16.942403 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7srxv_0c42846f-f2bf-4928-bc9b-d200b10cbf9b/openstack-network-exporter/0.log" Mar 19 20:20:16 crc kubenswrapper[4826]: I0319 20:20:16.963478 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a1f4e03c-3679-4122-a562-0de802c7f1c8/openstackclient/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.106462 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2vpmv_91261e82-2106-4923-9f54-8a7e04b033b8/ovsdb-server-init/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.325520 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2vpmv_91261e82-2106-4923-9f54-8a7e04b033b8/ovsdb-server-init/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.341999 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2vpmv_91261e82-2106-4923-9f54-8a7e04b033b8/ovs-vswitchd/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.357718 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2vpmv_91261e82-2106-4923-9f54-8a7e04b033b8/ovsdb-server/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.630493 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wdll6_2ed5ed9d-f761-4b5d-8cc8-07693c1d1289/ovn-controller/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.724080 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ftdhw_56108211-8bba-4740-8883-b40c8a139f8e/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.832241 4826 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e/openstack-network-exporter/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.881778 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d6c4b87f-ecc2-49f3-98db-f9e81cacdb7e/ovn-northd/0.log" Mar 19 20:20:17 crc kubenswrapper[4826]: I0319 20:20:17.900981 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7411e60b-ca3b-4409-a289-0513649c49b3/openstack-network-exporter/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.100633 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7411e60b-ca3b-4409-a289-0513649c49b3/ovsdbserver-nb/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.182856 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a1bd0e4d-d264-47a7-a3d0-71d4824ca253/openstack-network-exporter/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.238322 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a1bd0e4d-d264-47a7-a3d0-71d4824ca253/ovsdbserver-sb/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.446479 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58fdd88484-zg97f_aa0311ae-1bf2-42fe-a9f8-557f722b8065/placement-api/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.525312 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58fdd88484-zg97f_aa0311ae-1bf2-42fe-a9f8-557f722b8065/placement-log/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.586700 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/init-config-reloader/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.814562 4826 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/init-config-reloader/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.845779 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/prometheus/0.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.846494 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/prometheus/1.log" Mar 19 20:20:18 crc kubenswrapper[4826]: I0319 20:20:18.868211 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/config-reloader/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.050206 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_bf194957-ec68-4ea7-b094-3e0912bc3bc5/thanos-sidecar/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.104877 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_208228fc-8848-4817-96ea-48e37f6386ce/setup-container/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.432576 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_208228fc-8848-4817-96ea-48e37f6386ce/rabbitmq/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.434418 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7e2447b-d047-4e45-a992-ffa82c1c5215/setup-container/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.453320 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_208228fc-8848-4817-96ea-48e37f6386ce/setup-container/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.772735 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_c7e2447b-d047-4e45-a992-ffa82c1c5215/setup-container/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.828840 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_97a64d0f-56cc-4ec0-9e02-49fbbe998f43/setup-container/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.838517 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c7e2447b-d047-4e45-a992-ffa82c1c5215/rabbitmq/0.log" Mar 19 20:20:19 crc kubenswrapper[4826]: I0319 20:20:19.978188 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:20:19 crc kubenswrapper[4826]: E0319 20:20:19.978626 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.059733 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_de5e1809-5292-4c32-a83e-9dbb01f1db4b/setup-container/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.081773 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_97a64d0f-56cc-4ec0-9e02-49fbbe998f43/rabbitmq/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.112296 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_97a64d0f-56cc-4ec0-9e02-49fbbe998f43/setup-container/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.365923 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-tfjnp_23035475-3ff6-49e0-9810-c27013a74f8c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.373162 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_de5e1809-5292-4c32-a83e-9dbb01f1db4b/setup-container/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.435921 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_de5e1809-5292-4c32-a83e-9dbb01f1db4b/rabbitmq/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.645591 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qb95v_d00cc58b-deec-42ba-aab9-2f6cfd6ff5a4/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.707000 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-k6wzl_d499bbf3-6fa0-4467-92f6-7ccaa0f71b06/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.912588 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jb5ln_0778f133-8f61-4ccc-bde7-a664c3ff638b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:20 crc kubenswrapper[4826]: I0319 20:20:20.971956 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-njzdg_a28972ae-f55e-4db8-9c20-28befd60e934/ssh-known-hosts-edpm-deployment/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.229881 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5448ff86dc-ccghd_2b2152b5-e974-469b-84e7-c0ca4d4a9826/proxy-server/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.338097 4826 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-proxy-5448ff86dc-ccghd_2b2152b5-e974-469b-84e7-c0ca4d4a9826/proxy-httpd/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.444999 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-s8pzl_9327e4ee-66b1-4f08-9cb8-9facc43491b4/swift-ring-rebalance/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.527846 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/account-auditor/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.566873 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/account-reaper/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.818722 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/account-replicator/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.837621 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/account-server/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.837960 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/container-auditor/0.log" Mar 19 20:20:21 crc kubenswrapper[4826]: I0319 20:20:21.848325 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/container-replicator/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.004926 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/container-updater/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.064621 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/container-server/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.066190 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/object-expirer/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.083361 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/object-auditor/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.265449 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/object-replicator/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.295213 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/object-server/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.297093 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/rsync/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.342665 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/object-updater/0.log" Mar 19 20:20:22 crc kubenswrapper[4826]: I0319 20:20:22.566902 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_775f9d8a-377a-4913-b2d2-3bb1b7aec077/swift-recon-cron/0.log" Mar 19 20:20:23 crc kubenswrapper[4826]: I0319 20:20:23.039848 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-78qhl_b0112fd9-267c-4357-8120-f42c43662900/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:23 crc kubenswrapper[4826]: I0319 20:20:23.242485 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f524c648-b4b0-4f10-abf3-5a3ccc1d96a0/test-operator-logs-container/0.log" Mar 19 20:20:23 crc kubenswrapper[4826]: I0319 20:20:23.357524 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-hzf4h_22837e37-d9e9-4044-b5bf-34a4c866ebf0/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:23 crc kubenswrapper[4826]: I0319 20:20:23.441919 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kjfmq_d357bfcc-5ce5-456d-b8b7-142ef30a57b0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 20:20:23 crc kubenswrapper[4826]: I0319 20:20:23.501200 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_36ba563e-3246-4dbb-ba34-e15fd6646fad/tempest-tests-tempest-tests-runner/0.log" Mar 19 20:20:31 crc kubenswrapper[4826]: I0319 20:20:31.143006 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_128ebe90-32c9-409b-b145-5f7f95c7dbbf/memcached/0.log" Mar 19 20:20:34 crc kubenswrapper[4826]: I0319 20:20:34.976100 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:20:34 crc kubenswrapper[4826]: E0319 20:20:34.977921 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:20:43 crc kubenswrapper[4826]: I0319 20:20:43.490478 4826 scope.go:117] "RemoveContainer" 
containerID="945b8e02372f2bc787044d5007674cf89e3f5889d0014bbaf6ca51c6a1bcb018" Mar 19 20:20:48 crc kubenswrapper[4826]: I0319 20:20:48.977101 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:20:48 crc kubenswrapper[4826]: E0319 20:20:48.977844 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.164247 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/util/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.439624 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/pull/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.486339 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/pull/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.501308 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/util/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.795458 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/extract/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.805812 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/pull/0.log" Mar 19 20:20:54 crc kubenswrapper[4826]: I0319 20:20:54.889883 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05cb5f7ae00b7b66d548dae7cc683a8a3d9e5bec9566df415063c2908b64vts_6898d200-6ab5-4f88-8390-712724dbeb63/util/0.log" Mar 19 20:20:55 crc kubenswrapper[4826]: I0319 20:20:55.016488 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-jjqrs_5f60643c-c919-436b-bd23-9e39698d9c9b/manager/0.log" Mar 19 20:20:55 crc kubenswrapper[4826]: I0319 20:20:55.375544 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-hf8n5_080fa697-4720-424e-b75e-6564061cd68f/manager/1.log" Mar 19 20:20:55 crc kubenswrapper[4826]: I0319 20:20:55.607741 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-hf8n5_080fa697-4720-424e-b75e-6564061cd68f/manager/0.log" Mar 19 20:20:55 crc kubenswrapper[4826]: I0319 20:20:55.858666 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-8265b_0f77f094-1b90-43a6-85be-27e8b1fda71f/manager/1.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.093495 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-8265b_0f77f094-1b90-43a6-85be-27e8b1fda71f/manager/0.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 
20:20:56.255855 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-zm4ps_38267b94-39ea-4067-9b6e-3d863ff60494/manager/1.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.306839 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rsrjx_d2375678-e630-4376-9dfd-28efbc77aed4/manager/1.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.403748 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rsrjx_d2375678-e630-4376-9dfd-28efbc77aed4/manager/0.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.520833 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-ngb9j_ee5c97c9-5dc0-4292-9a34-08ca45f5387a/manager/1.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.658549 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-ngb9j_ee5c97c9-5dc0-4292-9a34-08ca45f5387a/manager/0.log" Mar 19 20:20:56 crc kubenswrapper[4826]: I0319 20:20:56.874490 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-zjkbj_a960df53-d712-424a-85a7-64b0e50c911f/manager/1.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.182395 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-725xd_f073a654-efe9-4fd0-9c08-23d9fdb0d492/manager/1.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.420584 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-725xd_f073a654-efe9-4fd0-9c08-23d9fdb0d492/manager/0.log" Mar 19 20:20:57 crc kubenswrapper[4826]: 
I0319 20:20:57.447753 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-zjkbj_a960df53-d712-424a-85a7-64b0e50c911f/manager/0.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.499694 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-zm4ps_38267b94-39ea-4067-9b6e-3d863ff60494/manager/0.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.773672 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-4bkbn_49f5fbe6-ba93-4ff2-b575-aa08dceb2622/manager/1.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.779548 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-4bkbn_49f5fbe6-ba93-4ff2-b575-aa08dceb2622/manager/0.log" Mar 19 20:20:57 crc kubenswrapper[4826]: I0319 20:20:57.827627 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-zrczt_6a5ffd48-ea97-46a0-b9ed-f7c38d5d8a90/manager/0.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.009704 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-xpq6x_271f8c86-929d-46a4-8852-f5ec8e701bcb/manager/0.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.085534 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-sfs65_918ac815-fe60-44b9-b6c0-c99ee8dc80b8/manager/1.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.271804 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-sfs65_918ac815-fe60-44b9-b6c0-c99ee8dc80b8/manager/0.log" Mar 19 20:20:58 crc 
kubenswrapper[4826]: I0319 20:20:58.396141 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-tjcmb_44055ef9-1bc5-4b25-a40d-553a1546fc15/manager/1.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.533505 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-tjcmb_44055ef9-1bc5-4b25-a40d-553a1546fc15/manager/0.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.577607 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-j4p25_7137162e-cccf-4ce6-9dc4-7380db33a85a/manager/1.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.751102 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-j4p25_7137162e-cccf-4ce6-9dc4-7380db33a85a/manager/0.log" Mar 19 20:20:58 crc kubenswrapper[4826]: I0319 20:20:58.919095 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-b76w9_4f382869-5ee2-4a46-8188-d4ddd0bee2fa/manager/0.log" Mar 19 20:20:59 crc kubenswrapper[4826]: I0319 20:20:59.086277 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-b76w9_4f382869-5ee2-4a46-8188-d4ddd0bee2fa/manager/1.log" Mar 19 20:20:59 crc kubenswrapper[4826]: I0319 20:20:59.203441 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6c6f68556d-k5tlt_c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1/operator/1.log" Mar 19 20:20:59 crc kubenswrapper[4826]: I0319 20:20:59.342741 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-646cd56bc9-8t2bm_50980b03-91b0-4e4d-9923-e2a531458fd4/manager/1.log" Mar 19 20:20:59 crc kubenswrapper[4826]: I0319 20:20:59.459110 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6c6f68556d-k5tlt_c045bb2f-b87b-4a14-92b5-0b98cdc7a0d1/operator/0.log" Mar 19 20:20:59 crc kubenswrapper[4826]: I0319 20:20:59.963549 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wrwzn_f9f3d33c-0421-473c-94e6-a7860932d772/registry-server/1.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.084518 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wrwzn_f9f3d33c-0421-473c-94e6-a7860932d772/registry-server/0.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.251946 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-zs74n_6243b523-966a-4f1d-b663-2f1ed4614fdb/manager/1.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.380635 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-zs74n_6243b523-966a-4f1d-b663-2f1ed4614fdb/manager/0.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.414621 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-kkmzl_79a89fcd-3226-4314-951d-d94af2ac242c/manager/1.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.655478 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-kkmzl_79a89fcd-3226-4314-951d-d94af2ac242c/manager/0.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.662218 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jvh4_b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6/operator/1.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.722140 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5jvh4_b00ec043-3d8c-41dd-bbef-fc99f7ad0bb6/operator/0.log" Mar 19 20:21:00 crc kubenswrapper[4826]: I0319 20:21:00.985580 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-skdcp_aff2d31f-3465-4c0c-8bbf-b04dfdb92db0/manager/0.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.203255 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-6vmk6_e36e6f7a-53ec-4262-b9e5-798353e5bf15/manager/0.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.213363 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-6vmk6_e36e6f7a-53ec-4262-b9e5-798353e5bf15/manager/1.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.281798 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-646cd56bc9-8t2bm_50980b03-91b0-4e4d-9923-e2a531458fd4/manager/0.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.364227 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6c5c766d94-258q2_5d8869b3-7d43-4db2-b79d-f05c13d0d6f2/manager/0.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.462621 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-7l4t6_dc64459f-49c1-41f5-b946-88ab7bc8e1d8/manager/0.log" Mar 19 20:21:01 crc kubenswrapper[4826]: I0319 20:21:01.981351 4826 scope.go:117] 
"RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:21:01 crc kubenswrapper[4826]: E0319 20:21:01.981567 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:21:15 crc kubenswrapper[4826]: I0319 20:21:15.987958 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:21:15 crc kubenswrapper[4826]: E0319 20:21:15.992594 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:21:25 crc kubenswrapper[4826]: I0319 20:21:25.403947 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nw52z_d3d2b5c3-e37e-4a58-af35-8980d9c8d43a/control-plane-machine-set-operator/0.log" Mar 19 20:21:25 crc kubenswrapper[4826]: I0319 20:21:25.518959 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6tq97_84d8b63e-fcf9-45c2-98be-d2aa00660cee/kube-rbac-proxy/0.log" Mar 19 20:21:25 crc kubenswrapper[4826]: I0319 20:21:25.543110 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6tq97_84d8b63e-fcf9-45c2-98be-d2aa00660cee/machine-api-operator/0.log" Mar 19 20:21:27 crc kubenswrapper[4826]: I0319 20:21:27.977273 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:21:27 crc kubenswrapper[4826]: E0319 20:21:27.977956 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:21:38 crc kubenswrapper[4826]: I0319 20:21:38.979806 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:21:38 crc kubenswrapper[4826]: E0319 20:21:38.980624 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:21:41 crc kubenswrapper[4826]: I0319 20:21:41.552618 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-btnzb_5161bc15-b664-4436-ae44-dc85f97b5dd3/cert-manager-controller/0.log" Mar 19 20:21:41 crc kubenswrapper[4826]: I0319 20:21:41.731082 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mdv5n_ed4460c1-0ccc-4545-9477-7a5e8380c5e6/cert-manager-cainjector/0.log" Mar 19 20:21:41 crc 
kubenswrapper[4826]: I0319 20:21:41.823127 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-fhw8v_fe2ad622-0df2-4cb2-8c00-45f4d9a8a1c3/cert-manager-webhook/0.log" Mar 19 20:21:50 crc kubenswrapper[4826]: I0319 20:21:50.976049 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:21:50 crc kubenswrapper[4826]: E0319 20:21:50.976909 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.261650 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-8f82b_d34bdc1e-adc4-4363-b623-6644010a58dc/nmstate-console-plugin/0.log" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.542613 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zqv4g_9f450458-8845-4ec5-9971-6df9dd448312/nmstate-handler/0.log" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.589509 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ws7kv_6365b3ab-4090-4880-b583-0ad075dd3c4d/kube-rbac-proxy/0.log" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.691189 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ws7kv_6365b3ab-4090-4880-b583-0ad075dd3c4d/nmstate-metrics/0.log" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.806121 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-jtglg_b08225e9-d4ce-4b9f-b87e-c586e3d8c47f/nmstate-operator/0.log" Mar 19 20:21:57 crc kubenswrapper[4826]: I0319 20:21:57.942712 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-d4pjw_afb786fa-7916-4f36-9978-5bd829c9dbf8/nmstate-webhook/0.log" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.155922 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565862-8g8qh"] Mar 19 20:22:00 crc kubenswrapper[4826]: E0319 20:22:00.158320 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a654ca5-dc45-4a08-9c35-e25fb06760c6" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.158340 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a654ca5-dc45-4a08-9c35-e25fb06760c6" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.158721 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a654ca5-dc45-4a08-9c35-e25fb06760c6" containerName="oc" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.159714 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.163506 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.163900 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.164191 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.177570 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-8g8qh"] Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.220425 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22tf2\" (UniqueName: \"kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2\") pod \"auto-csr-approver-29565862-8g8qh\" (UID: \"266a96af-b06d-410f-a96e-567f232185be\") " pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.322114 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22tf2\" (UniqueName: \"kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2\") pod \"auto-csr-approver-29565862-8g8qh\" (UID: \"266a96af-b06d-410f-a96e-567f232185be\") " pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.366682 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22tf2\" (UniqueName: \"kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2\") pod \"auto-csr-approver-29565862-8g8qh\" (UID: \"266a96af-b06d-410f-a96e-567f232185be\") " 
pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:00 crc kubenswrapper[4826]: I0319 20:22:00.481369 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:01 crc kubenswrapper[4826]: I0319 20:22:01.533091 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 20:22:01 crc kubenswrapper[4826]: I0319 20:22:01.540474 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-8g8qh"] Mar 19 20:22:01 crc kubenswrapper[4826]: I0319 20:22:01.910251 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" event={"ID":"266a96af-b06d-410f-a96e-567f232185be","Type":"ContainerStarted","Data":"4d21c9fa1fbd01ba802580f9d2e83cf25ccfa2334735a1f989cd4021cb03f9fd"} Mar 19 20:22:03 crc kubenswrapper[4826]: I0319 20:22:03.943278 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" event={"ID":"266a96af-b06d-410f-a96e-567f232185be","Type":"ContainerStarted","Data":"26a03e5c5c482527d2b2a1d3880cb796f7ac8ca88ede2ad192109ea489987c0b"} Mar 19 20:22:03 crc kubenswrapper[4826]: I0319 20:22:03.966814 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" podStartSLOduration=3.076477993 podStartE2EDuration="3.966794348s" podCreationTimestamp="2026-03-19 20:22:00 +0000 UTC" firstStartedPulling="2026-03-19 20:22:01.532239674 +0000 UTC m=+5146.286307987" lastFinishedPulling="2026-03-19 20:22:02.422556019 +0000 UTC m=+5147.176624342" observedRunningTime="2026-03-19 20:22:03.96059393 +0000 UTC m=+5148.714662243" watchObservedRunningTime="2026-03-19 20:22:03.966794348 +0000 UTC m=+5148.720862661" Mar 19 20:22:04 crc kubenswrapper[4826]: I0319 20:22:04.955665 4826 generic.go:334] "Generic (PLEG): container finished" 
podID="266a96af-b06d-410f-a96e-567f232185be" containerID="26a03e5c5c482527d2b2a1d3880cb796f7ac8ca88ede2ad192109ea489987c0b" exitCode=0 Mar 19 20:22:04 crc kubenswrapper[4826]: I0319 20:22:04.955706 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" event={"ID":"266a96af-b06d-410f-a96e-567f232185be","Type":"ContainerDied","Data":"26a03e5c5c482527d2b2a1d3880cb796f7ac8ca88ede2ad192109ea489987c0b"} Mar 19 20:22:05 crc kubenswrapper[4826]: I0319 20:22:05.986393 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:22:05 crc kubenswrapper[4826]: E0319 20:22:05.987217 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:22:06 crc kubenswrapper[4826]: I0319 20:22:06.418856 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:06 crc kubenswrapper[4826]: I0319 20:22:06.529086 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22tf2\" (UniqueName: \"kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2\") pod \"266a96af-b06d-410f-a96e-567f232185be\" (UID: \"266a96af-b06d-410f-a96e-567f232185be\") " Mar 19 20:22:06 crc kubenswrapper[4826]: I0319 20:22:06.698598 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2" (OuterVolumeSpecName: "kube-api-access-22tf2") pod "266a96af-b06d-410f-a96e-567f232185be" (UID: "266a96af-b06d-410f-a96e-567f232185be"). InnerVolumeSpecName "kube-api-access-22tf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:22:06 crc kubenswrapper[4826]: I0319 20:22:06.792075 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22tf2\" (UniqueName: \"kubernetes.io/projected/266a96af-b06d-410f-a96e-567f232185be-kube-api-access-22tf2\") on node \"crc\" DevicePath \"\"" Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.031285 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" event={"ID":"266a96af-b06d-410f-a96e-567f232185be","Type":"ContainerDied","Data":"4d21c9fa1fbd01ba802580f9d2e83cf25ccfa2334735a1f989cd4021cb03f9fd"} Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.031344 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d21c9fa1fbd01ba802580f9d2e83cf25ccfa2334735a1f989cd4021cb03f9fd" Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.031356 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565862-8g8qh" Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.049510 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-nwv2c"] Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.058542 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565856-nwv2c"] Mar 19 20:22:07 crc kubenswrapper[4826]: I0319 20:22:07.992988 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773a141c-3fd8-4f47-b8e1-ad98015ea7d4" path="/var/lib/kubelet/pods/773a141c-3fd8-4f47-b8e1-ad98015ea7d4/volumes" Mar 19 20:22:14 crc kubenswrapper[4826]: I0319 20:22:14.021981 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/manager/1.log" Mar 19 20:22:14 crc kubenswrapper[4826]: I0319 20:22:14.087303 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/kube-rbac-proxy/0.log" Mar 19 20:22:14 crc kubenswrapper[4826]: I0319 20:22:14.313590 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/manager/0.log" Mar 19 20:22:16 crc kubenswrapper[4826]: I0319 20:22:16.976059 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:22:16 crc kubenswrapper[4826]: E0319 20:22:16.976669 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:22:27 crc kubenswrapper[4826]: I0319 20:22:27.976723 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:22:27 crc kubenswrapper[4826]: E0319 20:22:27.978067 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:22:31 crc kubenswrapper[4826]: I0319 20:22:31.323626 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-q6rkt_ac9e546d-1803-4ef0-a76c-6d9e0823c010/prometheus-operator/0.log" Mar 19 20:22:31 crc kubenswrapper[4826]: I0319 20:22:31.604871 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv_2e6734a7-50cb-4366-baab-fa0feba677f0/prometheus-operator-admission-webhook/0.log" Mar 19 20:22:32 crc kubenswrapper[4826]: I0319 20:22:32.337555 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-tcdmb_217c809e-0af8-4b11-a5ce-932d698ed444/operator/1.log" Mar 19 20:22:32 crc kubenswrapper[4826]: I0319 20:22:32.363177 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft_f1278453-65e7-42aa-8d60-b42e0c10f232/prometheus-operator-admission-webhook/0.log" Mar 19 20:22:32 crc kubenswrapper[4826]: 
I0319 20:22:32.716265 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-tcdmb_217c809e-0af8-4b11-a5ce-932d698ed444/operator/0.log" Mar 19 20:22:32 crc kubenswrapper[4826]: I0319 20:22:32.899911 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-9r5qq_e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5/observability-ui-dashboards/0.log" Mar 19 20:22:32 crc kubenswrapper[4826]: I0319 20:22:32.985964 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6648f6899-wbmts_8eb71543-680b-4018-94e4-572cfcc12660/perses-operator/0.log" Mar 19 20:22:39 crc kubenswrapper[4826]: I0319 20:22:39.977100 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:22:39 crc kubenswrapper[4826]: E0319 20:22:39.978368 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:22:43 crc kubenswrapper[4826]: I0319 20:22:43.623155 4826 scope.go:117] "RemoveContainer" containerID="a27243d89376028b26d66dfc28c4f4d0217cd4075ac5c42e4aa3f89b931bc73f" Mar 19 20:22:50 crc kubenswrapper[4826]: I0319 20:22:50.899360 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-gpn69_c2f22ac6-fb47-448e-8570-b95a2688d081/cluster-logging-operator/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.066521 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_collector-76xlf_ec6a09bc-174e-4e15-a61c-74bac9b3baa3/collector/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.151861 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_377fff75-1f59-4c28-a3ed-2bd89e803b73/loki-compactor/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.291031 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-qrlfg_e1f51b15-5d82-43d5-b391-5f4b10434957/loki-distributor/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.347301 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-68b4bcd8f5-mhqzk_1e1484c9-801f-4999-9754-456df604d7ca/gateway/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.447492 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-68b4bcd8f5-mhqzk_1e1484c9-801f-4999-9754-456df604d7ca/opa/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.515555 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-68b4bcd8f5-zvtrc_64ca34d8-5f9f-448d-9ab2-414c5b4757e9/gateway/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.525400 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-68b4bcd8f5-zvtrc_64ca34d8-5f9f-448d-9ab2-414c5b4757e9/opa/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.720554 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_05505420-3d58-4de7-9da6-2f27e54c32f5/loki-index-gateway/0.log" Mar 19 20:22:51 crc kubenswrapper[4826]: I0319 20:22:51.824802 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_93990ea7-96ba-4c12-b92c-17a7c38aece4/loki-ingester/0.log" Mar 19 20:22:51 crc 
kubenswrapper[4826]: I0319 20:22:51.928461 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-qltmk_081e84d7-1c7e-4c6f-935e-ee01eaf393e2/loki-querier/0.log" Mar 19 20:22:52 crc kubenswrapper[4826]: I0319 20:22:52.057021 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-l2p46_0fc08676-ae6f-4018-8f85-259585de45fe/loki-query-frontend/0.log" Mar 19 20:22:54 crc kubenswrapper[4826]: I0319 20:22:54.976031 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:22:54 crc kubenswrapper[4826]: E0319 20:22:54.976765 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:23:07 crc kubenswrapper[4826]: I0319 20:23:07.976628 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:23:07 crc kubenswrapper[4826]: E0319 20:23:07.977615 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:23:09 crc kubenswrapper[4826]: I0319 20:23:09.625802 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cnfr9_13651756-55fe-46f1-b849-fbdc5dc20887/kube-rbac-proxy/0.log" Mar 19 20:23:09 crc kubenswrapper[4826]: I0319 20:23:09.748056 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cnfr9_13651756-55fe-46f1-b849-fbdc5dc20887/controller/0.log" Mar 19 20:23:09 crc kubenswrapper[4826]: I0319 20:23:09.834940 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-frr-files/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.052599 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-reloader/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.055329 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-frr-files/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.090499 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-metrics/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.094026 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-reloader/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.294114 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-metrics/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.329197 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-frr-files/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.336596 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-metrics/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.385569 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-reloader/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.593821 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-metrics/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.606193 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-reloader/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.614998 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/cp-frr-files/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.668575 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/controller/1.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.820066 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/controller/0.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.862495 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/frr/1.log" Mar 19 20:23:10 crc kubenswrapper[4826]: I0319 20:23:10.949945 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/frr-metrics/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.106616 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/kube-rbac-proxy/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.131072 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/kube-rbac-proxy-frr/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.212111 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/reloader/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.406127 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-6btqx_81cad5dc-6bd8-4081-adc1-28f65b056636/frr-k8s-webhook-server/1.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.475790 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-6btqx_81cad5dc-6bd8-4081-adc1-28f65b056636/frr-k8s-webhook-server/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.658217 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84d47777df-4x998_010ce31f-d333-43a9-b1e0-cd85cc0f6fd6/manager/1.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.715806 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84d47777df-4x998_010ce31f-d333-43a9-b1e0-cd85cc0f6fd6/manager/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.934871 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8645ff956b-rx86q_b57da585-9fca-48a5-a872-e5019db1e36e/webhook-server/0.log" Mar 19 20:23:11 crc kubenswrapper[4826]: I0319 20:23:11.981491 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8645ff956b-rx86q_b57da585-9fca-48a5-a872-e5019db1e36e/webhook-server/1.log" Mar 19 20:23:12 crc kubenswrapper[4826]: I0319 20:23:12.190312 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w2f68_b812f1db-b2c8-467c-977a-a8661540546e/kube-rbac-proxy/0.log" Mar 19 20:23:12 crc kubenswrapper[4826]: I0319 20:23:12.443588 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w2f68_b812f1db-b2c8-467c-977a-a8661540546e/speaker/1.log" Mar 19 20:23:12 crc kubenswrapper[4826]: I0319 20:23:12.892644 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-prxxj_b724e39c-45b5-4701-b4f0-a19969224d90/frr/0.log" Mar 19 20:23:13 crc kubenswrapper[4826]: I0319 20:23:13.120632 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w2f68_b812f1db-b2c8-467c-977a-a8661540546e/speaker/0.log" Mar 19 20:23:21 crc kubenswrapper[4826]: I0319 20:23:21.976405 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:23:21 crc kubenswrapper[4826]: E0319 20:23:21.977518 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:23:28 crc kubenswrapper[4826]: I0319 20:23:28.642262 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/util/0.log" Mar 19 20:23:28 crc kubenswrapper[4826]: I0319 20:23:28.906273 4826 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/pull/0.log" Mar 19 20:23:28 crc kubenswrapper[4826]: I0319 20:23:28.937960 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/util/0.log" Mar 19 20:23:28 crc kubenswrapper[4826]: I0319 20:23:28.958026 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/pull/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.128433 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/util/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.129817 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/extract/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.137057 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874n5kn7_d850e5c4-d43d-452c-9d27-69cb4cda0dd5/pull/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.332914 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/util/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.474887 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/pull/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.493224 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/pull/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.506067 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/util/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.687986 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/extract/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.691449 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/util/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.693243 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c19jdwp_9ab1a846-5046-4bc2-a83a-a7a1ee360c2e/pull/0.log" Mar 19 20:23:29 crc kubenswrapper[4826]: I0319 20:23:29.892712 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/util/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.056021 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/pull/0.log" Mar 19 
20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.076889 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/util/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.097268 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/pull/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.749023 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/util/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.755847 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/pull/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.813766 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5t4jtf_1e8dab2c-5a4e-4383-85a4-3422eac078ee/extract/0.log" Mar 19 20:23:30 crc kubenswrapper[4826]: I0319 20:23:30.979698 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/util/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.126550 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/pull/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.143622 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/util/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.217062 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/pull/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.396796 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/pull/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.416204 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/extract/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.418253 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cc68dz_40389493-fee7-4684-8af2-6b5845158143/util/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.583950 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/util/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.809290 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/util/0.log" Mar 19 20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.822634 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/pull/0.log" Mar 19 
20:23:31 crc kubenswrapper[4826]: I0319 20:23:31.845905 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/pull/0.log" Mar 19 20:23:32 crc kubenswrapper[4826]: I0319 20:23:32.038896 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/pull/0.log" Mar 19 20:23:32 crc kubenswrapper[4826]: I0319 20:23:32.048073 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/util/0.log" Mar 19 20:23:32 crc kubenswrapper[4826]: I0319 20:23:32.078954 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwmbv_e8bfbc17-505e-4154-98c2-6e9c25345308/extract/0.log" Mar 19 20:23:32 crc kubenswrapper[4826]: I0319 20:23:32.976812 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:23:32 crc kubenswrapper[4826]: E0319 20:23:32.977786 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.378903 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-utilities/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 
20:23:33.535563 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-content/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.546194 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-utilities/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.547317 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-content/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.696527 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-content/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.763508 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-utilities/0.log" Mar 19 20:23:33 crc kubenswrapper[4826]: I0319 20:23:33.821584 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/extract-utilities/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.031705 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-content/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.093222 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mw64r_a3821b4d-7122-428f-be08-2c5f72a29b1d/registry-server/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.132199 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-utilities/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.142516 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-content/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.262619 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-utilities/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.266609 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/extract-content/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.334525 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/registry-server/1.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.507175 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-66v8z_f182fb72-66c7-4d5d-bccd-29a47b27f4c6/marketplace-operator/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.530327 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p9rmq_3a20f6a8-01f3-4492-856d-e5f494672fa3/registry-server/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.541552 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-66v8z_f182fb72-66c7-4d5d-bccd-29a47b27f4c6/marketplace-operator/1.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.599438 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-utilities/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.809511 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-content/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.841598 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-utilities/0.log" Mar 19 20:23:34 crc kubenswrapper[4826]: I0319 20:23:34.843142 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-content/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.029872 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-content/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.029877 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/extract-utilities/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.073834 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-utilities/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.237069 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-plvms_623d1506-574a-4eee-9f1c-cc0ee85e9083/registry-server/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.306676 4826 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-content/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.319055 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-utilities/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.354553 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-content/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.530639 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-content/0.log" Mar 19 20:23:35 crc kubenswrapper[4826]: I0319 20:23:35.534686 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/extract-utilities/0.log" Mar 19 20:23:36 crc kubenswrapper[4826]: I0319 20:23:36.393604 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnzmj_326a5687-dfe7-4a01-b8b9-c6bedd76684a/registry-server/0.log" Mar 19 20:23:45 crc kubenswrapper[4826]: I0319 20:23:45.991251 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:23:45 crc kubenswrapper[4826]: E0319 20:23:45.992910 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:23:52 crc 
kubenswrapper[4826]: I0319 20:23:52.231859 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-q6rkt_ac9e546d-1803-4ef0-a76c-6d9e0823c010/prometheus-operator/0.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.256009 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b8dd7d9db-hhlpv_2e6734a7-50cb-4366-baab-fa0feba677f0/prometheus-operator-admission-webhook/0.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.406817 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b8dd7d9db-jbnft_f1278453-65e7-42aa-8d60-b42e0c10f232/prometheus-operator-admission-webhook/0.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.435324 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-tcdmb_217c809e-0af8-4b11-a5ce-932d698ed444/operator/1.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.493744 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-tcdmb_217c809e-0af8-4b11-a5ce-932d698ed444/operator/0.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.575470 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-9r5qq_e3abbb77-c3e9-4c0f-8038-2cdc6ddd10a5/observability-ui-dashboards/0.log" Mar 19 20:23:52 crc kubenswrapper[4826]: I0319 20:23:52.614689 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6648f6899-wbmts_8eb71543-680b-4018-94e4-572cfcc12660/perses-operator/0.log" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.146392 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565864-xgs52"] Mar 19 20:24:00 crc 
kubenswrapper[4826]: E0319 20:24:00.147331 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266a96af-b06d-410f-a96e-567f232185be" containerName="oc" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.147344 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="266a96af-b06d-410f-a96e-567f232185be" containerName="oc" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.147580 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="266a96af-b06d-410f-a96e-567f232185be" containerName="oc" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.148339 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.150554 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.150826 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.150838 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.165896 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-xgs52"] Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.330716 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zxd7\" (UniqueName: \"kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7\") pod \"auto-csr-approver-29565864-xgs52\" (UID: \"30c9f9f1-b6bd-49e5-bc84-e101f2d58499\") " pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.433735 4826 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9zxd7\" (UniqueName: \"kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7\") pod \"auto-csr-approver-29565864-xgs52\" (UID: \"30c9f9f1-b6bd-49e5-bc84-e101f2d58499\") " pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.461826 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zxd7\" (UniqueName: \"kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7\") pod \"auto-csr-approver-29565864-xgs52\" (UID: \"30c9f9f1-b6bd-49e5-bc84-e101f2d58499\") " pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.467107 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.974977 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-xgs52"] Mar 19 20:24:00 crc kubenswrapper[4826]: I0319 20:24:00.978438 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:24:00 crc kubenswrapper[4826]: E0319 20:24:00.978698 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:24:01 crc kubenswrapper[4826]: I0319 20:24:01.483845 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-xgs52" 
event={"ID":"30c9f9f1-b6bd-49e5-bc84-e101f2d58499","Type":"ContainerStarted","Data":"66d15db6fe75388b6fa180afbfd7b80c58480dca83a4752a4c37e3db90fab940"} Mar 19 20:24:02 crc kubenswrapper[4826]: I0319 20:24:02.497162 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-xgs52" event={"ID":"30c9f9f1-b6bd-49e5-bc84-e101f2d58499","Type":"ContainerStarted","Data":"2059b0764d92c96ab5112fc6f92288a7188bb448b20e2ba665cae80068218aae"} Mar 19 20:24:02 crc kubenswrapper[4826]: I0319 20:24:02.520902 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565864-xgs52" podStartSLOduration=1.6408221809999999 podStartE2EDuration="2.520881269s" podCreationTimestamp="2026-03-19 20:24:00 +0000 UTC" firstStartedPulling="2026-03-19 20:24:00.9804151 +0000 UTC m=+5265.734483413" lastFinishedPulling="2026-03-19 20:24:01.860474188 +0000 UTC m=+5266.614542501" observedRunningTime="2026-03-19 20:24:02.513173224 +0000 UTC m=+5267.267241537" watchObservedRunningTime="2026-03-19 20:24:02.520881269 +0000 UTC m=+5267.274949572" Mar 19 20:24:03 crc kubenswrapper[4826]: I0319 20:24:03.513318 4826 generic.go:334] "Generic (PLEG): container finished" podID="30c9f9f1-b6bd-49e5-bc84-e101f2d58499" containerID="2059b0764d92c96ab5112fc6f92288a7188bb448b20e2ba665cae80068218aae" exitCode=0 Mar 19 20:24:03 crc kubenswrapper[4826]: I0319 20:24:03.513410 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-xgs52" event={"ID":"30c9f9f1-b6bd-49e5-bc84-e101f2d58499","Type":"ContainerDied","Data":"2059b0764d92c96ab5112fc6f92288a7188bb448b20e2ba665cae80068218aae"} Mar 19 20:24:04 crc kubenswrapper[4826]: I0319 20:24:04.987611 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.085313 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zxd7\" (UniqueName: \"kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7\") pod \"30c9f9f1-b6bd-49e5-bc84-e101f2d58499\" (UID: \"30c9f9f1-b6bd-49e5-bc84-e101f2d58499\") " Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.092318 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7" (OuterVolumeSpecName: "kube-api-access-9zxd7") pod "30c9f9f1-b6bd-49e5-bc84-e101f2d58499" (UID: "30c9f9f1-b6bd-49e5-bc84-e101f2d58499"). InnerVolumeSpecName "kube-api-access-9zxd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.188508 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zxd7\" (UniqueName: \"kubernetes.io/projected/30c9f9f1-b6bd-49e5-bc84-e101f2d58499-kube-api-access-9zxd7\") on node \"crc\" DevicePath \"\"" Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.545813 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565864-xgs52" event={"ID":"30c9f9f1-b6bd-49e5-bc84-e101f2d58499","Type":"ContainerDied","Data":"66d15db6fe75388b6fa180afbfd7b80c58480dca83a4752a4c37e3db90fab940"} Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.545866 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d15db6fe75388b6fa180afbfd7b80c58480dca83a4752a4c37e3db90fab940" Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.545914 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565864-xgs52" Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.595107 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-pj7w6"] Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.607558 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565858-pj7w6"] Mar 19 20:24:05 crc kubenswrapper[4826]: I0319 20:24:05.992512 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec82184c-99e1-400c-b68e-a504cda95f2e" path="/var/lib/kubelet/pods/ec82184c-99e1-400c-b68e-a504cda95f2e/volumes" Mar 19 20:24:08 crc kubenswrapper[4826]: I0319 20:24:08.694876 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/manager/1.log" Mar 19 20:24:08 crc kubenswrapper[4826]: I0319 20:24:08.721488 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/manager/0.log" Mar 19 20:24:08 crc kubenswrapper[4826]: I0319 20:24:08.761099 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-d88f59dd5-fqs6s_84bba80c-841e-4df3-87e0-901afbc23bf3/kube-rbac-proxy/0.log" Mar 19 20:24:16 crc kubenswrapper[4826]: I0319 20:24:16.009221 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:24:16 crc kubenswrapper[4826]: E0319 20:24:16.010107 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:24:27 crc kubenswrapper[4826]: I0319 20:24:27.980382 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:24:27 crc kubenswrapper[4826]: E0319 20:24:27.981046 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:24:40 crc kubenswrapper[4826]: I0319 20:24:40.976475 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:24:40 crc kubenswrapper[4826]: E0319 20:24:40.977403 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:24:44 crc kubenswrapper[4826]: I0319 20:24:44.592219 4826 scope.go:117] "RemoveContainer" containerID="b244a43eade554532371b14bdc314cbae1451b53ab4c37fa35d68a7bc2fa3960" Mar 19 20:24:44 crc kubenswrapper[4826]: I0319 20:24:44.762412 4826 scope.go:117] "RemoveContainer" containerID="ac3f90f5690c61ca239256ac11f844b35a0251e1f2d64107902be402c4c3a9f9" Mar 19 20:24:52 crc kubenswrapper[4826]: I0319 20:24:52.976221 4826 
scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:24:52 crc kubenswrapper[4826]: E0319 20:24:52.977055 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:25:06 crc kubenswrapper[4826]: I0319 20:25:06.011230 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823" Mar 19 20:25:06 crc kubenswrapper[4826]: I0319 20:25:06.405224 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95"} Mar 19 20:25:44 crc kubenswrapper[4826]: I0319 20:25:44.874312 4826 scope.go:117] "RemoveContainer" containerID="149aa9af905edd8da0d991f72d546c6e3e17572ae31db109c0dc924ea6ad308e" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.200009 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565866-l8968"] Mar 19 20:26:00 crc kubenswrapper[4826]: E0319 20:26:00.201080 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c9f9f1-b6bd-49e5-bc84-e101f2d58499" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.201094 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c9f9f1-b6bd-49e5-bc84-e101f2d58499" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.201336 4826 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30c9f9f1-b6bd-49e5-bc84-e101f2d58499" containerName="oc" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.202334 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.204712 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.204870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.204986 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.220879 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-l8968"] Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.298338 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lf78\" (UniqueName: \"kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78\") pod \"auto-csr-approver-29565866-l8968\" (UID: \"f797fa48-2452-4895-b010-66602f49638e\") " pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.401022 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lf78\" (UniqueName: \"kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78\") pod \"auto-csr-approver-29565866-l8968\" (UID: \"f797fa48-2452-4895-b010-66602f49638e\") " pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.440614 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lf78\" (UniqueName: 
\"kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78\") pod \"auto-csr-approver-29565866-l8968\" (UID: \"f797fa48-2452-4895-b010-66602f49638e\") " pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:00 crc kubenswrapper[4826]: I0319 20:26:00.522591 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:01 crc kubenswrapper[4826]: I0319 20:26:01.041285 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565866-l8968"] Mar 19 20:26:01 crc kubenswrapper[4826]: I0319 20:26:01.195721 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-l8968" event={"ID":"f797fa48-2452-4895-b010-66602f49638e","Type":"ContainerStarted","Data":"a1ee89304ed661b3a0788b42795428a98496c0cd63e0b22681678255b6242d71"} Mar 19 20:26:03 crc kubenswrapper[4826]: I0319 20:26:03.218689 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-l8968" event={"ID":"f797fa48-2452-4895-b010-66602f49638e","Type":"ContainerStarted","Data":"5b4daf45cf384b8c09f0c3f2ffc6cbfb3fddc2bd70f5971b59d7db41dad50917"} Mar 19 20:26:03 crc kubenswrapper[4826]: I0319 20:26:03.241943 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565866-l8968" podStartSLOduration=2.083834935 podStartE2EDuration="3.241923424s" podCreationTimestamp="2026-03-19 20:26:00 +0000 UTC" firstStartedPulling="2026-03-19 20:26:01.05301287 +0000 UTC m=+5385.807081183" lastFinishedPulling="2026-03-19 20:26:02.211101349 +0000 UTC m=+5386.965169672" observedRunningTime="2026-03-19 20:26:03.236116165 +0000 UTC m=+5387.990184468" watchObservedRunningTime="2026-03-19 20:26:03.241923424 +0000 UTC m=+5387.995991737" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.234109 4826 generic.go:334] "Generic (PLEG): container 
finished" podID="f797fa48-2452-4895-b010-66602f49638e" containerID="5b4daf45cf384b8c09f0c3f2ffc6cbfb3fddc2bd70f5971b59d7db41dad50917" exitCode=0 Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.234569 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-l8968" event={"ID":"f797fa48-2452-4895-b010-66602f49638e","Type":"ContainerDied","Data":"5b4daf45cf384b8c09f0c3f2ffc6cbfb3fddc2bd70f5971b59d7db41dad50917"} Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.603588 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.625819 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.625918 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.713542 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.713670 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.713746 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l5wgx\" (UniqueName: \"kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.817363 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.817871 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.817977 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.818178 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5wgx\" (UniqueName: \"kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.818234 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.853329 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5wgx\" (UniqueName: \"kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx\") pod \"community-operators-bxqzl\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:04 crc kubenswrapper[4826]: I0319 20:26:04.948183 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:05 crc kubenswrapper[4826]: I0319 20:26:05.525451 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:05 crc kubenswrapper[4826]: I0319 20:26:05.763407 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:05 crc kubenswrapper[4826]: I0319 20:26:05.946809 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lf78\" (UniqueName: \"kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78\") pod \"f797fa48-2452-4895-b010-66602f49638e\" (UID: \"f797fa48-2452-4895-b010-66602f49638e\") " Mar 19 20:26:05 crc kubenswrapper[4826]: I0319 20:26:05.955613 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78" (OuterVolumeSpecName: "kube-api-access-8lf78") pod "f797fa48-2452-4895-b010-66602f49638e" (UID: "f797fa48-2452-4895-b010-66602f49638e"). InnerVolumeSpecName "kube-api-access-8lf78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.054387 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lf78\" (UniqueName: \"kubernetes.io/projected/f797fa48-2452-4895-b010-66602f49638e-kube-api-access-8lf78\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.272485 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565866-l8968" event={"ID":"f797fa48-2452-4895-b010-66602f49638e","Type":"ContainerDied","Data":"a1ee89304ed661b3a0788b42795428a98496c0cd63e0b22681678255b6242d71"} Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.272848 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1ee89304ed661b3a0788b42795428a98496c0cd63e0b22681678255b6242d71" Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.272926 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565866-l8968" Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.286338 4826 generic.go:334] "Generic (PLEG): container finished" podID="9227da5a-9c34-4a13-ae29-a7442de81053" containerID="d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b" exitCode=0 Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.286388 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerDied","Data":"d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b"} Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.286417 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerStarted","Data":"40b6c07d05f027c865d14ce7d7921533b2c9519bdb940350b34792c21a80c6d7"} Mar 19 20:26:06 crc 
kubenswrapper[4826]: I0319 20:26:06.336642 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-7lvs2"] Mar 19 20:26:06 crc kubenswrapper[4826]: I0319 20:26:06.357048 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565860-7lvs2"] Mar 19 20:26:08 crc kubenswrapper[4826]: I0319 20:26:08.009967 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a654ca5-dc45-4a08-9c35-e25fb06760c6" path="/var/lib/kubelet/pods/0a654ca5-dc45-4a08-9c35-e25fb06760c6/volumes" Mar 19 20:26:08 crc kubenswrapper[4826]: I0319 20:26:08.324880 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerStarted","Data":"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4"} Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.062680 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:11 crc kubenswrapper[4826]: E0319 20:26:11.063546 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f797fa48-2452-4895-b010-66602f49638e" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.063562 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="f797fa48-2452-4895-b010-66602f49638e" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.063859 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="f797fa48-2452-4895-b010-66602f49638e" containerName="oc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.065563 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.078346 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:11 crc kubenswrapper[4826]: E0319 20:26:11.154449 4826 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9227da5a_9c34_4a13_ae29_a7442de81053.slice/crio-conmon-b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4.scope\": RecentStats: unable to find data in memory cache]" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.226363 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvg9n\" (UniqueName: \"kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.226702 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.227183 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.329210 4826 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.329392 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvg9n\" (UniqueName: \"kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.329422 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.329867 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.329975 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.350688 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dvg9n\" (UniqueName: \"kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n\") pod \"redhat-marketplace-v59cc\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.363579 4826 generic.go:334] "Generic (PLEG): container finished" podID="9227da5a-9c34-4a13-ae29-a7442de81053" containerID="b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4" exitCode=0 Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.363682 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerDied","Data":"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4"} Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.368436 4826 generic.go:334] "Generic (PLEG): container finished" podID="9e171a7e-b733-47f9-807a-51f2828b6927" containerID="16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118" exitCode=0 Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.368489 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-splz6/must-gather-c99v7" event={"ID":"9e171a7e-b733-47f9-807a-51f2828b6927","Type":"ContainerDied","Data":"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118"} Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.369695 4826 scope.go:117] "RemoveContainer" containerID="16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.388870 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.647921 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-splz6_must-gather-c99v7_9e171a7e-b733-47f9-807a-51f2828b6927/gather/0.log" Mar 19 20:26:11 crc kubenswrapper[4826]: I0319 20:26:11.949950 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:11 crc kubenswrapper[4826]: W0319 20:26:11.950355 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9639e00b_90c7_457d_af6f_7923fc9aeca4.slice/crio-9f21b5e708539126799c9f1d65190794ee7cdfabeb700024d4109d805f463e7b WatchSource:0}: Error finding container 9f21b5e708539126799c9f1d65190794ee7cdfabeb700024d4109d805f463e7b: Status 404 returned error can't find the container with id 9f21b5e708539126799c9f1d65190794ee7cdfabeb700024d4109d805f463e7b Mar 19 20:26:12 crc kubenswrapper[4826]: I0319 20:26:12.387043 4826 generic.go:334] "Generic (PLEG): container finished" podID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerID="2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185" exitCode=0 Mar 19 20:26:12 crc kubenswrapper[4826]: I0319 20:26:12.387167 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerDied","Data":"2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185"} Mar 19 20:26:12 crc kubenswrapper[4826]: I0319 20:26:12.387681 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerStarted","Data":"9f21b5e708539126799c9f1d65190794ee7cdfabeb700024d4109d805f463e7b"} Mar 19 20:26:13 crc kubenswrapper[4826]: I0319 20:26:13.412938 4826 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerStarted","Data":"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf"} Mar 19 20:26:13 crc kubenswrapper[4826]: I0319 20:26:13.443279 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxqzl" podStartSLOduration=3.322761164 podStartE2EDuration="9.443259409s" podCreationTimestamp="2026-03-19 20:26:04 +0000 UTC" firstStartedPulling="2026-03-19 20:26:06.289602273 +0000 UTC m=+5391.043670586" lastFinishedPulling="2026-03-19 20:26:12.410100518 +0000 UTC m=+5397.164168831" observedRunningTime="2026-03-19 20:26:13.439451947 +0000 UTC m=+5398.193520260" watchObservedRunningTime="2026-03-19 20:26:13.443259409 +0000 UTC m=+5398.197327722" Mar 19 20:26:14 crc kubenswrapper[4826]: I0319 20:26:14.951314 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:14 crc kubenswrapper[4826]: I0319 20:26:14.951840 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:15 crc kubenswrapper[4826]: I0319 20:26:15.443071 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerStarted","Data":"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28"} Mar 19 20:26:16 crc kubenswrapper[4826]: I0319 20:26:16.012274 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bxqzl" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server" probeResult="failure" output=< Mar 19 20:26:16 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:26:16 crc kubenswrapper[4826]: > Mar 
19 20:26:16 crc kubenswrapper[4826]: I0319 20:26:16.457451 4826 generic.go:334] "Generic (PLEG): container finished" podID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerID="f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28" exitCode=0 Mar 19 20:26:16 crc kubenswrapper[4826]: I0319 20:26:16.457515 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerDied","Data":"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28"} Mar 19 20:26:17 crc kubenswrapper[4826]: I0319 20:26:17.471631 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerStarted","Data":"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb"} Mar 19 20:26:17 crc kubenswrapper[4826]: I0319 20:26:17.503000 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v59cc" podStartSLOduration=1.916196446 podStartE2EDuration="6.502980443s" podCreationTimestamp="2026-03-19 20:26:11 +0000 UTC" firstStartedPulling="2026-03-19 20:26:12.409133194 +0000 UTC m=+5397.163201507" lastFinishedPulling="2026-03-19 20:26:16.995917191 +0000 UTC m=+5401.749985504" observedRunningTime="2026-03-19 20:26:17.491393285 +0000 UTC m=+5402.245461598" watchObservedRunningTime="2026-03-19 20:26:17.502980443 +0000 UTC m=+5402.257048756" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.749897 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.752960 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.772028 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.851973 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.852139 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.852215 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wg2p\" (UniqueName: \"kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.955002 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.955448 4826 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.955644 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.955917 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wg2p\" (UniqueName: \"kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.955860 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:19 crc kubenswrapper[4826]: I0319 20:26:19.976597 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wg2p\" (UniqueName: \"kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p\") pod \"certified-operators-b2hdk\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:20 crc kubenswrapper[4826]: I0319 20:26:20.086214 4826 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:20 crc kubenswrapper[4826]: I0319 20:26:20.668427 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:21 crc kubenswrapper[4826]: I0319 20:26:21.390258 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:21 crc kubenswrapper[4826]: I0319 20:26:21.390475 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:21 crc kubenswrapper[4826]: I0319 20:26:21.533939 4826 generic.go:334] "Generic (PLEG): container finished" podID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerID="203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0" exitCode=0 Mar 19 20:26:21 crc kubenswrapper[4826]: I0319 20:26:21.533989 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerDied","Data":"203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0"} Mar 19 20:26:21 crc kubenswrapper[4826]: I0319 20:26:21.534200 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerStarted","Data":"0eb7db66baff99ca54c271a8038bf0eaf59b762e80b581da95cfc87544443c55"} Mar 19 20:26:22 crc kubenswrapper[4826]: I0319 20:26:22.444064 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v59cc" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server" probeResult="failure" output=< Mar 19 20:26:22 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:26:22 crc kubenswrapper[4826]: > Mar 19 20:26:23 crc kubenswrapper[4826]: I0319 
20:26:23.557941 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerStarted","Data":"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82"} Mar 19 20:26:23 crc kubenswrapper[4826]: I0319 20:26:23.720933 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-splz6/must-gather-c99v7"] Mar 19 20:26:23 crc kubenswrapper[4826]: I0319 20:26:23.721202 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-splz6/must-gather-c99v7" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="copy" containerID="cri-o://4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a" gracePeriod=2 Mar 19 20:26:23 crc kubenswrapper[4826]: I0319 20:26:23.733979 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-splz6/must-gather-c99v7"] Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.383814 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-splz6_must-gather-c99v7_9e171a7e-b733-47f9-807a-51f2828b6927/copy/0.log" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.384786 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.477867 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw6wt\" (UniqueName: \"kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt\") pod \"9e171a7e-b733-47f9-807a-51f2828b6927\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.478095 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output\") pod \"9e171a7e-b733-47f9-807a-51f2828b6927\" (UID: \"9e171a7e-b733-47f9-807a-51f2828b6927\") " Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.488601 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt" (OuterVolumeSpecName: "kube-api-access-jw6wt") pod "9e171a7e-b733-47f9-807a-51f2828b6927" (UID: "9e171a7e-b733-47f9-807a-51f2828b6927"). InnerVolumeSpecName "kube-api-access-jw6wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.570617 4826 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-splz6_must-gather-c99v7_9e171a7e-b733-47f9-807a-51f2828b6927/copy/0.log" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.574809 4826 generic.go:334] "Generic (PLEG): container finished" podID="9e171a7e-b733-47f9-807a-51f2828b6927" containerID="4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a" exitCode=143 Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.574928 4826 scope.go:117] "RemoveContainer" containerID="4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.575382 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-splz6/must-gather-c99v7" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.583154 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw6wt\" (UniqueName: \"kubernetes.io/projected/9e171a7e-b733-47f9-807a-51f2828b6927-kube-api-access-jw6wt\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.619993 4826 scope.go:117] "RemoveContainer" containerID="16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.624836 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9e171a7e-b733-47f9-807a-51f2828b6927" (UID: "9e171a7e-b733-47f9-807a-51f2828b6927"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.663974 4826 scope.go:117] "RemoveContainer" containerID="4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a" Mar 19 20:26:24 crc kubenswrapper[4826]: E0319 20:26:24.666633 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a\": container with ID starting with 4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a not found: ID does not exist" containerID="4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.666858 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a"} err="failed to get container status \"4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a\": rpc error: code = NotFound desc = could not find container \"4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a\": container with ID starting with 4c9de78b9869f86ede10c625673cbbd878058d25777cf854a4dd1e6cd005833a not found: ID does not exist" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.667016 4826 scope.go:117] "RemoveContainer" containerID="16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118" Mar 19 20:26:24 crc kubenswrapper[4826]: E0319 20:26:24.667550 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118\": container with ID starting with 16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118 not found: ID does not exist" containerID="16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.667704 
4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118"} err="failed to get container status \"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118\": rpc error: code = NotFound desc = could not find container \"16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118\": container with ID starting with 16b5f0e5dbd407162ea88a68f1a3d5ed1e6de1d5b88efe25200151de34f1c118 not found: ID does not exist" Mar 19 20:26:24 crc kubenswrapper[4826]: I0319 20:26:24.685759 4826 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e171a7e-b733-47f9-807a-51f2828b6927-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:25 crc kubenswrapper[4826]: I0319 20:26:25.591813 4826 generic.go:334] "Generic (PLEG): container finished" podID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerID="e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82" exitCode=0 Mar 19 20:26:25 crc kubenswrapper[4826]: I0319 20:26:25.591856 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerDied","Data":"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82"} Mar 19 20:26:25 crc kubenswrapper[4826]: I0319 20:26:25.990683 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" path="/var/lib/kubelet/pods/9e171a7e-b733-47f9-807a-51f2828b6927/volumes" Mar 19 20:26:26 crc kubenswrapper[4826]: I0319 20:26:26.016924 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bxqzl" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server" probeResult="failure" output=< Mar 19 20:26:26 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" 
within 1s Mar 19 20:26:26 crc kubenswrapper[4826]: > Mar 19 20:26:26 crc kubenswrapper[4826]: I0319 20:26:26.631509 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerStarted","Data":"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61"} Mar 19 20:26:26 crc kubenswrapper[4826]: I0319 20:26:26.653781 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b2hdk" podStartSLOduration=3.144550565 podStartE2EDuration="7.65376094s" podCreationTimestamp="2026-03-19 20:26:19 +0000 UTC" firstStartedPulling="2026-03-19 20:26:21.537727019 +0000 UTC m=+5406.291795332" lastFinishedPulling="2026-03-19 20:26:26.046937394 +0000 UTC m=+5410.801005707" observedRunningTime="2026-03-19 20:26:26.648518844 +0000 UTC m=+5411.402587157" watchObservedRunningTime="2026-03-19 20:26:26.65376094 +0000 UTC m=+5411.407829253" Mar 19 20:26:30 crc kubenswrapper[4826]: I0319 20:26:30.087286 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:30 crc kubenswrapper[4826]: I0319 20:26:30.088101 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:31 crc kubenswrapper[4826]: I0319 20:26:31.165775 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b2hdk" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="registry-server" probeResult="failure" output=< Mar 19 20:26:31 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:26:31 crc kubenswrapper[4826]: > Mar 19 20:26:32 crc kubenswrapper[4826]: I0319 20:26:32.445997 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v59cc" 
podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server" probeResult="failure" output=< Mar 19 20:26:32 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s Mar 19 20:26:32 crc kubenswrapper[4826]: > Mar 19 20:26:35 crc kubenswrapper[4826]: I0319 20:26:35.005389 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:35 crc kubenswrapper[4826]: I0319 20:26:35.090579 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:35 crc kubenswrapper[4826]: I0319 20:26:35.813484 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:36 crc kubenswrapper[4826]: I0319 20:26:36.755955 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bxqzl" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server" containerID="cri-o://3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf" gracePeriod=2 Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.307490 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.441006 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5wgx\" (UniqueName: \"kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx\") pod \"9227da5a-9c34-4a13-ae29-a7442de81053\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.441149 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content\") pod \"9227da5a-9c34-4a13-ae29-a7442de81053\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.441417 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities\") pod \"9227da5a-9c34-4a13-ae29-a7442de81053\" (UID: \"9227da5a-9c34-4a13-ae29-a7442de81053\") " Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.442137 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities" (OuterVolumeSpecName: "utilities") pod "9227da5a-9c34-4a13-ae29-a7442de81053" (UID: "9227da5a-9c34-4a13-ae29-a7442de81053"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.442465 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.451221 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx" (OuterVolumeSpecName: "kube-api-access-l5wgx") pod "9227da5a-9c34-4a13-ae29-a7442de81053" (UID: "9227da5a-9c34-4a13-ae29-a7442de81053"). InnerVolumeSpecName "kube-api-access-l5wgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.503523 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9227da5a-9c34-4a13-ae29-a7442de81053" (UID: "9227da5a-9c34-4a13-ae29-a7442de81053"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.544203 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5wgx\" (UniqueName: \"kubernetes.io/projected/9227da5a-9c34-4a13-ae29-a7442de81053-kube-api-access-l5wgx\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.544237 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9227da5a-9c34-4a13-ae29-a7442de81053-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.769155 4826 generic.go:334] "Generic (PLEG): container finished" podID="9227da5a-9c34-4a13-ae29-a7442de81053" containerID="3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf" exitCode=0 Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.769222 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerDied","Data":"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf"} Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.769278 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqzl" event={"ID":"9227da5a-9c34-4a13-ae29-a7442de81053","Type":"ContainerDied","Data":"40b6c07d05f027c865d14ce7d7921533b2c9519bdb940350b34792c21a80c6d7"} Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.769308 4826 scope.go:117] "RemoveContainer" containerID="3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.769333 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxqzl" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.795948 4826 scope.go:117] "RemoveContainer" containerID="b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.845232 4826 scope.go:117] "RemoveContainer" containerID="d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.856322 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.876475 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bxqzl"] Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.889864 4826 scope.go:117] "RemoveContainer" containerID="3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf" Mar 19 20:26:37 crc kubenswrapper[4826]: E0319 20:26:37.890285 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf\": container with ID starting with 3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf not found: ID does not exist" containerID="3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.890314 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf"} err="failed to get container status \"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf\": rpc error: code = NotFound desc = could not find container \"3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf\": container with ID starting with 3af444f6a36dbd5a6b06fec86794f4da824fe8883575235fc003bec1b3c697cf not 
found: ID does not exist" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.890333 4826 scope.go:117] "RemoveContainer" containerID="b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4" Mar 19 20:26:37 crc kubenswrapper[4826]: E0319 20:26:37.890622 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4\": container with ID starting with b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4 not found: ID does not exist" containerID="b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.890648 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4"} err="failed to get container status \"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4\": rpc error: code = NotFound desc = could not find container \"b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4\": container with ID starting with b998f50104743dcb1b1b66e8bb60a70f66a49e6f1be2cdf11915fe694a69bae4 not found: ID does not exist" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.890681 4826 scope.go:117] "RemoveContainer" containerID="d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b" Mar 19 20:26:37 crc kubenswrapper[4826]: E0319 20:26:37.890995 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b\": container with ID starting with d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b not found: ID does not exist" containerID="d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.891045 4826 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b"} err="failed to get container status \"d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b\": rpc error: code = NotFound desc = could not find container \"d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b\": container with ID starting with d22e6c56db184af10dc0b02067df6f83c7fbe6f44f074fc16369c6a6b5a0e14b not found: ID does not exist" Mar 19 20:26:37 crc kubenswrapper[4826]: I0319 20:26:37.992807 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" path="/var/lib/kubelet/pods/9227da5a-9c34-4a13-ae29-a7442de81053/volumes" Mar 19 20:26:40 crc kubenswrapper[4826]: I0319 20:26:40.142650 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:40 crc kubenswrapper[4826]: I0319 20:26:40.229754 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:41 crc kubenswrapper[4826]: I0319 20:26:41.212369 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:41 crc kubenswrapper[4826]: I0319 20:26:41.475303 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:41 crc kubenswrapper[4826]: I0319 20:26:41.557358 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:41 crc kubenswrapper[4826]: I0319 20:26:41.834822 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b2hdk" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="registry-server" 
containerID="cri-o://b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61" gracePeriod=2 Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.386450 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.494107 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content\") pod \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.494247 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wg2p\" (UniqueName: \"kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p\") pod \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.494290 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities\") pod \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\" (UID: \"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6\") " Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.495717 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities" (OuterVolumeSpecName: "utilities") pod "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" (UID: "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.501015 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p" (OuterVolumeSpecName: "kube-api-access-9wg2p") pod "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" (UID: "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6"). InnerVolumeSpecName "kube-api-access-9wg2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.557502 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" (UID: "943caaa5-d3b3-47a3-ac4c-7aab7200e6c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.596482 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.596526 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wg2p\" (UniqueName: \"kubernetes.io/projected/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-kube-api-access-9wg2p\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.596540 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.848945 4826 generic.go:334] "Generic (PLEG): container finished" podID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" 
containerID="b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61" exitCode=0 Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.848992 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerDied","Data":"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61"} Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.849022 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2hdk" event={"ID":"943caaa5-d3b3-47a3-ac4c-7aab7200e6c6","Type":"ContainerDied","Data":"0eb7db66baff99ca54c271a8038bf0eaf59b762e80b581da95cfc87544443c55"} Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.849045 4826 scope.go:117] "RemoveContainer" containerID="b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.849064 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2hdk" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.880313 4826 scope.go:117] "RemoveContainer" containerID="e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.926725 4826 scope.go:117] "RemoveContainer" containerID="203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0" Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.926720 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:42 crc kubenswrapper[4826]: I0319 20:26:42.949616 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b2hdk"] Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.100290 4826 scope.go:117] "RemoveContainer" containerID="b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61" Mar 19 20:26:43 crc kubenswrapper[4826]: E0319 20:26:43.115823 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61\": container with ID starting with b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61 not found: ID does not exist" containerID="b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.115872 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61"} err="failed to get container status \"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61\": rpc error: code = NotFound desc = could not find container \"b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61\": container with ID starting with b2345506fcbe6b5dd1d3302dc4d0c7616d33333e46cc96900298df2315c95b61 not 
found: ID does not exist" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.115900 4826 scope.go:117] "RemoveContainer" containerID="e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82" Mar 19 20:26:43 crc kubenswrapper[4826]: E0319 20:26:43.125813 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82\": container with ID starting with e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82 not found: ID does not exist" containerID="e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.125858 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82"} err="failed to get container status \"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82\": rpc error: code = NotFound desc = could not find container \"e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82\": container with ID starting with e77ec8293bfff5fd91ca41ceda9deac3e7207be673ee0dfca946eca91bb7ed82 not found: ID does not exist" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.125884 4826 scope.go:117] "RemoveContainer" containerID="203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0" Mar 19 20:26:43 crc kubenswrapper[4826]: E0319 20:26:43.138803 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0\": container with ID starting with 203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0 not found: ID does not exist" containerID="203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.138861 4826 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0"} err="failed to get container status \"203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0\": rpc error: code = NotFound desc = could not find container \"203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0\": container with ID starting with 203f96f808e85bdf71542805d0f15da52f9e647b9f8edd5adacbf8a794da8fd0 not found: ID does not exist" Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.831453 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:43 crc kubenswrapper[4826]: I0319 20:26:43.832229 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v59cc" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server" containerID="cri-o://eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb" gracePeriod=2 Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.006159 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" path="/var/lib/kubelet/pods/943caaa5-d3b3-47a3-ac4c-7aab7200e6c6/volumes" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.389906 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.476828 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content\") pod \"9639e00b-90c7-457d-af6f-7923fc9aeca4\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.476921 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvg9n\" (UniqueName: \"kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n\") pod \"9639e00b-90c7-457d-af6f-7923fc9aeca4\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.476993 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities\") pod \"9639e00b-90c7-457d-af6f-7923fc9aeca4\" (UID: \"9639e00b-90c7-457d-af6f-7923fc9aeca4\") " Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.477827 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities" (OuterVolumeSpecName: "utilities") pod "9639e00b-90c7-457d-af6f-7923fc9aeca4" (UID: "9639e00b-90c7-457d-af6f-7923fc9aeca4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.482897 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n" (OuterVolumeSpecName: "kube-api-access-dvg9n") pod "9639e00b-90c7-457d-af6f-7923fc9aeca4" (UID: "9639e00b-90c7-457d-af6f-7923fc9aeca4"). InnerVolumeSpecName "kube-api-access-dvg9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.512205 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9639e00b-90c7-457d-af6f-7923fc9aeca4" (UID: "9639e00b-90c7-457d-af6f-7923fc9aeca4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.580514 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.580555 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9639e00b-90c7-457d-af6f-7923fc9aeca4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.580568 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvg9n\" (UniqueName: \"kubernetes.io/projected/9639e00b-90c7-457d-af6f-7923fc9aeca4-kube-api-access-dvg9n\") on node \"crc\" DevicePath \"\"" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.880647 4826 generic.go:334] "Generic (PLEG): container finished" podID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerID="eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb" exitCode=0 Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.880746 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v59cc" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.881270 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerDied","Data":"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb"} Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.881392 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v59cc" event={"ID":"9639e00b-90c7-457d-af6f-7923fc9aeca4","Type":"ContainerDied","Data":"9f21b5e708539126799c9f1d65190794ee7cdfabeb700024d4109d805f463e7b"} Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.881457 4826 scope.go:117] "RemoveContainer" containerID="eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.915561 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.923718 4826 scope.go:117] "RemoveContainer" containerID="f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.942515 4826 scope.go:117] "RemoveContainer" containerID="2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185" Mar 19 20:26:44 crc kubenswrapper[4826]: I0319 20:26:44.959108 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v59cc"] Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.010169 4826 scope.go:117] "RemoveContainer" containerID="4b8f142f166faec74c40a5246a9ba266558ced09680d95a3f74613d8760a6fb6" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.022441 4826 scope.go:117] "RemoveContainer" containerID="eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb" Mar 19 20:26:45 crc kubenswrapper[4826]: 
E0319 20:26:45.025091 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb\": container with ID starting with eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb not found: ID does not exist" containerID="eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.025126 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb"} err="failed to get container status \"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb\": rpc error: code = NotFound desc = could not find container \"eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb\": container with ID starting with eaff2a0a2e1bd1d2794725d7f7a12adcd85c5a7be2aaff4f239421872c415cfb not found: ID does not exist" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.025147 4826 scope.go:117] "RemoveContainer" containerID="f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28" Mar 19 20:26:45 crc kubenswrapper[4826]: E0319 20:26:45.025493 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28\": container with ID starting with f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28 not found: ID does not exist" containerID="f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.025539 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28"} err="failed to get container status \"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28\": 
rpc error: code = NotFound desc = could not find container \"f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28\": container with ID starting with f9e0851c9306a20c3c80e251984b56b0c40da1057e543b745c8b491a3d02cd28 not found: ID does not exist" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.025572 4826 scope.go:117] "RemoveContainer" containerID="2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185" Mar 19 20:26:45 crc kubenswrapper[4826]: E0319 20:26:45.025998 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185\": container with ID starting with 2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185 not found: ID does not exist" containerID="2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.026026 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185"} err="failed to get container status \"2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185\": rpc error: code = NotFound desc = could not find container \"2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185\": container with ID starting with 2fec3135b860194f05d1448c9454e57fd705b20388155d4f0d1e8128945f0185 not found: ID does not exist" Mar 19 20:26:45 crc kubenswrapper[4826]: I0319 20:26:45.992593 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" path="/var/lib/kubelet/pods/9639e00b-90c7-457d-af6f-7923fc9aeca4/volumes" Mar 19 20:27:25 crc kubenswrapper[4826]: I0319 20:27:25.400372 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:27:25 crc kubenswrapper[4826]: I0319 20:27:25.402420 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:27:55 crc kubenswrapper[4826]: I0319 20:27:55.400958 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:27:55 crc kubenswrapper[4826]: I0319 20:27:55.401943 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.177027 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565868-jj5ps"] Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178366 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="gather" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178388 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="gather" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178426 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="copy" Mar 19 
20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178440 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="copy" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178469 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178483 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178518 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178530 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178572 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178584 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178616 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178628 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178649 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server" Mar 19 20:28:00 crc 
kubenswrapper[4826]: I0319 20:28:00.178686 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178718 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178730 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178755 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178766 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="registry-server" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178800 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178812 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="extract-content" Mar 19 20:28:00 crc kubenswrapper[4826]: E0319 20:28:00.178839 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.178852 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="extract-utilities" Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.179241 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="943caaa5-d3b3-47a3-ac4c-7aab7200e6c6" containerName="registry-server" Mar 19 20:28:00 crc 
kubenswrapper[4826]: I0319 20:28:00.179286 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9227da5a-9c34-4a13-ae29-a7442de81053" containerName="registry-server"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.179311 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="gather"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.179348 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e171a7e-b733-47f9-807a-51f2828b6927" containerName="copy"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.179385 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="9639e00b-90c7-457d-af6f-7923fc9aeca4" containerName="registry-server"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.180814 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.183629 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.184201 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.184260 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.192195 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-jj5ps"]
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.362723 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znp7f\" (UniqueName: \"kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f\") pod \"auto-csr-approver-29565868-jj5ps\" (UID: \"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9\") " pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.466321 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znp7f\" (UniqueName: \"kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f\") pod \"auto-csr-approver-29565868-jj5ps\" (UID: \"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9\") " pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.492012 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znp7f\" (UniqueName: \"kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f\") pod \"auto-csr-approver-29565868-jj5ps\" (UID: \"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9\") " pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:00 crc kubenswrapper[4826]: I0319 20:28:00.515307 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:01 crc kubenswrapper[4826]: I0319 20:28:01.119518 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565868-jj5ps"]
Mar 19 20:28:01 crc kubenswrapper[4826]: I0319 20:28:01.684967 4826 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 20:28:02 crc kubenswrapper[4826]: I0319 20:28:02.000483 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-jj5ps" event={"ID":"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9","Type":"ContainerStarted","Data":"e7d02409d34beae4aaee3a803bac81911aea6c3949909b5a4f6dd5074cdb955d"}
Mar 19 20:28:04 crc kubenswrapper[4826]: I0319 20:28:04.025617 4826 generic.go:334] "Generic (PLEG): container finished" podID="48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9" containerID="cfc47d2f3ac5a040a0caeb144a0f1f328bf1c2f39fa544d8b1becb679f8b6051" exitCode=0
Mar 19 20:28:04 crc kubenswrapper[4826]: I0319 20:28:04.025948 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-jj5ps" event={"ID":"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9","Type":"ContainerDied","Data":"cfc47d2f3ac5a040a0caeb144a0f1f328bf1c2f39fa544d8b1becb679f8b6051"}
Mar 19 20:28:05 crc kubenswrapper[4826]: I0319 20:28:05.542970 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:05 crc kubenswrapper[4826]: I0319 20:28:05.640041 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znp7f\" (UniqueName: \"kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f\") pod \"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9\" (UID: \"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9\") "
Mar 19 20:28:05 crc kubenswrapper[4826]: I0319 20:28:05.647239 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f" (OuterVolumeSpecName: "kube-api-access-znp7f") pod "48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9" (UID: "48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9"). InnerVolumeSpecName "kube-api-access-znp7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:28:05 crc kubenswrapper[4826]: I0319 20:28:05.743418 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znp7f\" (UniqueName: \"kubernetes.io/projected/48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9-kube-api-access-znp7f\") on node \"crc\" DevicePath \"\""
Mar 19 20:28:06 crc kubenswrapper[4826]: I0319 20:28:06.057903 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565868-jj5ps" event={"ID":"48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9","Type":"ContainerDied","Data":"e7d02409d34beae4aaee3a803bac81911aea6c3949909b5a4f6dd5074cdb955d"}
Mar 19 20:28:06 crc kubenswrapper[4826]: I0319 20:28:06.058396 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7d02409d34beae4aaee3a803bac81911aea6c3949909b5a4f6dd5074cdb955d"
Mar 19 20:28:06 crc kubenswrapper[4826]: I0319 20:28:06.057970 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565868-jj5ps"
Mar 19 20:28:06 crc kubenswrapper[4826]: I0319 20:28:06.640583 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-8g8qh"]
Mar 19 20:28:06 crc kubenswrapper[4826]: I0319 20:28:06.650170 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565862-8g8qh"]
Mar 19 20:28:08 crc kubenswrapper[4826]: I0319 20:28:08.000139 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="266a96af-b06d-410f-a96e-567f232185be" path="/var/lib/kubelet/pods/266a96af-b06d-410f-a96e-567f232185be/volumes"
Mar 19 20:28:25 crc kubenswrapper[4826]: I0319 20:28:25.401287 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 20:28:25 crc kubenswrapper[4826]: I0319 20:28:25.401990 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 20:28:25 crc kubenswrapper[4826]: I0319 20:28:25.402108 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p"
Mar 19 20:28:25 crc kubenswrapper[4826]: I0319 20:28:25.403819 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 20:28:25 crc kubenswrapper[4826]: I0319 20:28:25.403927 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95" gracePeriod=600
Mar 19 20:28:26 crc kubenswrapper[4826]: I0319 20:28:26.364391 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerID="d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95" exitCode=0
Mar 19 20:28:26 crc kubenswrapper[4826]: I0319 20:28:26.364466 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95"}
Mar 19 20:28:26 crc kubenswrapper[4826]: I0319 20:28:26.365101 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerStarted","Data":"95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d"}
Mar 19 20:28:26 crc kubenswrapper[4826]: I0319 20:28:26.365167 4826 scope.go:117] "RemoveContainer" containerID="63ecc124824e01c5ccfd2a32cf4bb3e2efc0746dfbe17c96f0b271731ffe1823"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.244774 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:28:37 crc kubenswrapper[4826]: E0319 20:28:37.246730 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9" containerName="oc"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.246830 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9" containerName="oc"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.247159 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c4e7fb-d893-4bee-a9fb-ac57f28bc8f9" containerName="oc"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.250390 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.266344 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.343470 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlpsd\" (UniqueName: \"kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.344134 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.344431 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.446838 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlpsd\" (UniqueName: \"kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.446963 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.446998 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.447388 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.447554 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.474558 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlpsd\" (UniqueName: \"kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd\") pod \"redhat-operators-rrvvx\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") " pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:37 crc kubenswrapper[4826]: I0319 20:28:37.578454 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:38 crc kubenswrapper[4826]: I0319 20:28:38.125779 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:28:38 crc kubenswrapper[4826]: W0319 20:28:38.137604 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b1b58e_510a_4d71_b296_65226f028fbc.slice/crio-d8b6e3642d85139a5404e945d2514e3dfaadf1f0a49ec64288fc4ce9155a7fe2 WatchSource:0}: Error finding container d8b6e3642d85139a5404e945d2514e3dfaadf1f0a49ec64288fc4ce9155a7fe2: Status 404 returned error can't find the container with id d8b6e3642d85139a5404e945d2514e3dfaadf1f0a49ec64288fc4ce9155a7fe2
Mar 19 20:28:38 crc kubenswrapper[4826]: I0319 20:28:38.531270 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerStarted","Data":"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"}
Mar 19 20:28:38 crc kubenswrapper[4826]: I0319 20:28:38.531552 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerStarted","Data":"d8b6e3642d85139a5404e945d2514e3dfaadf1f0a49ec64288fc4ce9155a7fe2"}
Mar 19 20:28:39 crc kubenswrapper[4826]: I0319 20:28:39.549940 4826 generic.go:334] "Generic (PLEG): container finished" podID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerID="54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1" exitCode=0
Mar 19 20:28:39 crc kubenswrapper[4826]: I0319 20:28:39.550025 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerDied","Data":"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"}
Mar 19 20:28:40 crc kubenswrapper[4826]: I0319 20:28:40.563669 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerStarted","Data":"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"}
Mar 19 20:28:45 crc kubenswrapper[4826]: I0319 20:28:45.225847 4826 scope.go:117] "RemoveContainer" containerID="26a03e5c5c482527d2b2a1d3880cb796f7ac8ca88ede2ad192109ea489987c0b"
Mar 19 20:28:48 crc kubenswrapper[4826]: I0319 20:28:48.695241 4826 generic.go:334] "Generic (PLEG): container finished" podID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerID="913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65" exitCode=0
Mar 19 20:28:48 crc kubenswrapper[4826]: I0319 20:28:48.695326 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerDied","Data":"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"}
Mar 19 20:28:50 crc kubenswrapper[4826]: I0319 20:28:50.720471 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerStarted","Data":"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"}
Mar 19 20:28:50 crc kubenswrapper[4826]: I0319 20:28:50.755466 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rrvvx" podStartSLOduration=2.821223883 podStartE2EDuration="13.755445579s" podCreationTimestamp="2026-03-19 20:28:37 +0000 UTC" firstStartedPulling="2026-03-19 20:28:38.534479808 +0000 UTC m=+5543.288548121" lastFinishedPulling="2026-03-19 20:28:49.468701494 +0000 UTC m=+5554.222769817" observedRunningTime="2026-03-19 20:28:50.746268745 +0000 UTC m=+5555.500337088" watchObservedRunningTime="2026-03-19 20:28:50.755445579 +0000 UTC m=+5555.509513882"
Mar 19 20:28:57 crc kubenswrapper[4826]: I0319 20:28:57.578946 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:57 crc kubenswrapper[4826]: I0319 20:28:57.579619 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:28:58 crc kubenswrapper[4826]: I0319 20:28:58.661901 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrvvx" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:28:58 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:28:58 crc kubenswrapper[4826]: >
Mar 19 20:29:08 crc kubenswrapper[4826]: I0319 20:29:08.644217 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrvvx" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:29:08 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:29:08 crc kubenswrapper[4826]: >
Mar 19 20:29:18 crc kubenswrapper[4826]: I0319 20:29:18.628417 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrvvx" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:29:18 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:29:18 crc kubenswrapper[4826]: >
Mar 19 20:29:28 crc kubenswrapper[4826]: I0319 20:29:28.633531 4826 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rrvvx" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server" probeResult="failure" output=<
Mar 19 20:29:28 crc kubenswrapper[4826]: timeout: failed to connect service ":50051" within 1s
Mar 19 20:29:28 crc kubenswrapper[4826]: >
Mar 19 20:29:37 crc kubenswrapper[4826]: I0319 20:29:37.679361 4826 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:29:37 crc kubenswrapper[4826]: I0319 20:29:37.751276 4826 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:29:38 crc kubenswrapper[4826]: I0319 20:29:38.469069 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:29:39 crc kubenswrapper[4826]: I0319 20:29:39.374579 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rrvvx" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server" containerID="cri-o://971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d" gracePeriod=2
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.036200 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.183852 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlpsd\" (UniqueName: \"kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd\") pod \"e0b1b58e-510a-4d71-b296-65226f028fbc\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") "
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.183937 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities\") pod \"e0b1b58e-510a-4d71-b296-65226f028fbc\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") "
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.183998 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content\") pod \"e0b1b58e-510a-4d71-b296-65226f028fbc\" (UID: \"e0b1b58e-510a-4d71-b296-65226f028fbc\") "
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.184633 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities" (OuterVolumeSpecName: "utilities") pod "e0b1b58e-510a-4d71-b296-65226f028fbc" (UID: "e0b1b58e-510a-4d71-b296-65226f028fbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.184939 4826 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.191712 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd" (OuterVolumeSpecName: "kube-api-access-zlpsd") pod "e0b1b58e-510a-4d71-b296-65226f028fbc" (UID: "e0b1b58e-510a-4d71-b296-65226f028fbc"). InnerVolumeSpecName "kube-api-access-zlpsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.287296 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlpsd\" (UniqueName: \"kubernetes.io/projected/e0b1b58e-510a-4d71-b296-65226f028fbc-kube-api-access-zlpsd\") on node \"crc\" DevicePath \"\""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.326776 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0b1b58e-510a-4d71-b296-65226f028fbc" (UID: "e0b1b58e-510a-4d71-b296-65226f028fbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.389541 4826 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b1b58e-510a-4d71-b296-65226f028fbc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.391784 4826 generic.go:334] "Generic (PLEG): container finished" podID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerID="971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d" exitCode=0
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.391826 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerDied","Data":"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"}
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.391853 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rrvvx" event={"ID":"e0b1b58e-510a-4d71-b296-65226f028fbc","Type":"ContainerDied","Data":"d8b6e3642d85139a5404e945d2514e3dfaadf1f0a49ec64288fc4ce9155a7fe2"}
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.391868 4826 scope.go:117] "RemoveContainer" containerID="971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.392027 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rrvvx"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.430305 4826 scope.go:117] "RemoveContainer" containerID="913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.437400 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.454562 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rrvvx"]
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.490446 4826 scope.go:117] "RemoveContainer" containerID="54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.530114 4826 scope.go:117] "RemoveContainer" containerID="971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"
Mar 19 20:29:40 crc kubenswrapper[4826]: E0319 20:29:40.530766 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d\": container with ID starting with 971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d not found: ID does not exist" containerID="971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.530828 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d"} err="failed to get container status \"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d\": rpc error: code = NotFound desc = could not find container \"971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d\": container with ID starting with 971d3fd4453b0242790efb027feeec0ebb9e5fbb9f1fc08be3b00c280524650d not found: ID does not exist"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.530866 4826 scope.go:117] "RemoveContainer" containerID="913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"
Mar 19 20:29:40 crc kubenswrapper[4826]: E0319 20:29:40.531442 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65\": container with ID starting with 913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65 not found: ID does not exist" containerID="913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.531479 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65"} err="failed to get container status \"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65\": rpc error: code = NotFound desc = could not find container \"913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65\": container with ID starting with 913257bfb9ee853057a409fb7ec99192bc109e5e070bbbdddf3bf801b2480f65 not found: ID does not exist"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.531508 4826 scope.go:117] "RemoveContainer" containerID="54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"
Mar 19 20:29:40 crc kubenswrapper[4826]: E0319 20:29:40.532411 4826 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1\": container with ID starting with 54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1 not found: ID does not exist" containerID="54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"
Mar 19 20:29:40 crc kubenswrapper[4826]: I0319 20:29:40.532443 4826 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1"} err="failed to get container status \"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1\": rpc error: code = NotFound desc = could not find container \"54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1\": container with ID starting with 54c52d29e65294f9e00042f42600aff7bee4bfa6cc65901072dd42a32212c7c1 not found: ID does not exist"
Mar 19 20:29:42 crc kubenswrapper[4826]: I0319 20:29:42.006379 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" path="/var/lib/kubelet/pods/e0b1b58e-510a-4d71-b296-65226f028fbc/volumes"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.196002 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565870-wzl24"]
Mar 19 20:30:00 crc kubenswrapper[4826]: E0319 20:30:00.197244 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="extract-utilities"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.197266 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="extract-utilities"
Mar 19 20:30:00 crc kubenswrapper[4826]: E0319 20:30:00.197291 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.197302 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server"
Mar 19 20:30:00 crc kubenswrapper[4826]: E0319 20:30:00.197337 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="extract-content"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.197344 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="extract-content"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.197719 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b1b58e-510a-4d71-b296-65226f028fbc" containerName="registry-server"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.199012 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-wzl24"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.202230 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.203129 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.203123 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.220082 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"]
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.223758 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.230977 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.235870 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.253882 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-wzl24"]
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.264079 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"]
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.290409 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.290486 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.290645 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nsml\" (UniqueName: \"kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.290688 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z78z\" (UniqueName: \"kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z\") pod \"auto-csr-approver-29565870-wzl24\" (UID: \"1f214485-8639-43a0-a7d2-c8ecb99b9b6a\") " pod="openshift-infra/auto-csr-approver-29565870-wzl24"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.392866 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.393297 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nsml\" (UniqueName: \"kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"
Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.393447 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z78z\" (UniqueName: \"kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z\") pod \"auto-csr-approver-29565870-wzl24\" (UID: \"1f214485-8639-43a0-a7d2-c8ecb99b9b6a\") " pod="openshift-infra/auto-csr-approver-29565870-wzl24"
Mar 19 20:30:00 crc kubenswrapper[4826]: 
I0319 20:30:00.393584 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.394745 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.400228 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.412805 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z78z\" (UniqueName: \"kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z\") pod \"auto-csr-approver-29565870-wzl24\" (UID: \"1f214485-8639-43a0-a7d2-c8ecb99b9b6a\") " pod="openshift-infra/auto-csr-approver-29565870-wzl24" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.416709 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nsml\" (UniqueName: \"kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml\") pod \"collect-profiles-29565870-tbwgq\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.533892 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-wzl24" Mar 19 20:30:00 crc kubenswrapper[4826]: I0319 20:30:00.551167 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:01 crc kubenswrapper[4826]: I0319 20:30:01.092667 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565870-wzl24"] Mar 19 20:30:01 crc kubenswrapper[4826]: I0319 20:30:01.104068 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq"] Mar 19 20:30:02 crc kubenswrapper[4826]: W0319 20:30:02.195958 4826 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a255471_5ac1_40ec_9ba2_fa5a0cac0d8c.slice/crio-bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5 WatchSource:0}: Error finding container bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5: Status 404 returned error can't find the container with id bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5 Mar 19 20:30:02 crc kubenswrapper[4826]: I0319 20:30:02.722148 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-wzl24" event={"ID":"1f214485-8639-43a0-a7d2-c8ecb99b9b6a","Type":"ContainerStarted","Data":"f6ffb04576c0def5832bce9cef5c649998f5aa7e697c6caafb8ec48cf0e4bcd0"} Mar 19 20:30:02 crc kubenswrapper[4826]: I0319 20:30:02.724326 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" 
event={"ID":"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c","Type":"ContainerStarted","Data":"f9a88b2f8d75db0d5f337cae20cdf422d2bee568ffb88edfafa2c3d19dea0f9e"} Mar 19 20:30:02 crc kubenswrapper[4826]: I0319 20:30:02.724435 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" event={"ID":"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c","Type":"ContainerStarted","Data":"bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5"} Mar 19 20:30:02 crc kubenswrapper[4826]: I0319 20:30:02.756215 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" podStartSLOduration=2.756197286 podStartE2EDuration="2.756197286s" podCreationTimestamp="2026-03-19 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 20:30:02.744044639 +0000 UTC m=+5627.498112982" watchObservedRunningTime="2026-03-19 20:30:02.756197286 +0000 UTC m=+5627.510265599" Mar 19 20:30:03 crc kubenswrapper[4826]: I0319 20:30:03.744041 4826 generic.go:334] "Generic (PLEG): container finished" podID="2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" containerID="f9a88b2f8d75db0d5f337cae20cdf422d2bee568ffb88edfafa2c3d19dea0f9e" exitCode=0 Mar 19 20:30:03 crc kubenswrapper[4826]: I0319 20:30:03.744153 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" event={"ID":"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c","Type":"ContainerDied","Data":"f9a88b2f8d75db0d5f337cae20cdf422d2bee568ffb88edfafa2c3d19dea0f9e"} Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.229746 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.321886 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume\") pod \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.321995 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nsml\" (UniqueName: \"kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml\") pod \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.322216 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume\") pod \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\" (UID: \"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c\") " Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.322990 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" (UID: "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.329774 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml" (OuterVolumeSpecName: "kube-api-access-2nsml") pod "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" (UID: "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c"). 
InnerVolumeSpecName "kube-api-access-2nsml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.330072 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" (UID: "2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.425613 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nsml\" (UniqueName: \"kubernetes.io/projected/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-kube-api-access-2nsml\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.426019 4826 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.426035 4826 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.778425 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" event={"ID":"2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c","Type":"ContainerDied","Data":"bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5"} Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.778500 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb54af26b7a79dfe15e3f8ef9f36a6dafceabfb1768d6625007913e428aef5f5" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.778524 4826 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565870-tbwgq" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.782344 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-wzl24" event={"ID":"1f214485-8639-43a0-a7d2-c8ecb99b9b6a","Type":"ContainerStarted","Data":"9928ead343fb1ddce451416ff3537d992e31393e0abd9c1e7a40704364b7d8df"} Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.823959 4826 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565870-wzl24" podStartSLOduration=3.41373569 podStartE2EDuration="5.823937929s" podCreationTimestamp="2026-03-19 20:30:00 +0000 UTC" firstStartedPulling="2026-03-19 20:30:02.188225702 +0000 UTC m=+5626.942294055" lastFinishedPulling="2026-03-19 20:30:04.598427981 +0000 UTC m=+5629.352496294" observedRunningTime="2026-03-19 20:30:05.798106848 +0000 UTC m=+5630.552175181" watchObservedRunningTime="2026-03-19 20:30:05.823937929 +0000 UTC m=+5630.578006242" Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.847272 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw"] Mar 19 20:30:05 crc kubenswrapper[4826]: I0319 20:30:05.861236 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565825-229hw"] Mar 19 20:30:06 crc kubenswrapper[4826]: I0319 20:30:06.014643 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af735e0a-a52c-4254-843b-fd55bee90670" path="/var/lib/kubelet/pods/af735e0a-a52c-4254-843b-fd55bee90670/volumes" Mar 19 20:30:06 crc kubenswrapper[4826]: I0319 20:30:06.795113 4826 generic.go:334] "Generic (PLEG): container finished" podID="1f214485-8639-43a0-a7d2-c8ecb99b9b6a" containerID="9928ead343fb1ddce451416ff3537d992e31393e0abd9c1e7a40704364b7d8df" exitCode=0 Mar 19 20:30:06 
crc kubenswrapper[4826]: I0319 20:30:06.795155 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-wzl24" event={"ID":"1f214485-8639-43a0-a7d2-c8ecb99b9b6a","Type":"ContainerDied","Data":"9928ead343fb1ddce451416ff3537d992e31393e0abd9c1e7a40704364b7d8df"} Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.262487 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-wzl24" Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.307243 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z78z\" (UniqueName: \"kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z\") pod \"1f214485-8639-43a0-a7d2-c8ecb99b9b6a\" (UID: \"1f214485-8639-43a0-a7d2-c8ecb99b9b6a\") " Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.315140 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z" (OuterVolumeSpecName: "kube-api-access-4z78z") pod "1f214485-8639-43a0-a7d2-c8ecb99b9b6a" (UID: "1f214485-8639-43a0-a7d2-c8ecb99b9b6a"). InnerVolumeSpecName "kube-api-access-4z78z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.411105 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z78z\" (UniqueName: \"kubernetes.io/projected/1f214485-8639-43a0-a7d2-c8ecb99b9b6a-kube-api-access-4z78z\") on node \"crc\" DevicePath \"\"" Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.822700 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565870-wzl24" event={"ID":"1f214485-8639-43a0-a7d2-c8ecb99b9b6a","Type":"ContainerDied","Data":"f6ffb04576c0def5832bce9cef5c649998f5aa7e697c6caafb8ec48cf0e4bcd0"} Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.822742 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6ffb04576c0def5832bce9cef5c649998f5aa7e697c6caafb8ec48cf0e4bcd0" Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.822808 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565870-wzl24" Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.884435 4826 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-xgs52"] Mar 19 20:30:08 crc kubenswrapper[4826]: I0319 20:30:08.899425 4826 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565864-xgs52"] Mar 19 20:30:09 crc kubenswrapper[4826]: I0319 20:30:09.989371 4826 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c9f9f1-b6bd-49e5-bc84-e101f2d58499" path="/var/lib/kubelet/pods/30c9f9f1-b6bd-49e5-bc84-e101f2d58499/volumes" Mar 19 20:30:25 crc kubenswrapper[4826]: I0319 20:30:25.400995 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 20:30:25 crc kubenswrapper[4826]: I0319 20:30:25.401901 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:30:45 crc kubenswrapper[4826]: I0319 20:30:45.572072 4826 scope.go:117] "RemoveContainer" containerID="2059b0764d92c96ab5112fc6f92288a7188bb448b20e2ba665cae80068218aae" Mar 19 20:30:45 crc kubenswrapper[4826]: I0319 20:30:45.644836 4826 scope.go:117] "RemoveContainer" containerID="bc6081120dcaba9b18b8df60a7de48e7bc07d3c360e614e940234e59143fd2c7" Mar 19 20:30:55 crc kubenswrapper[4826]: I0319 20:30:55.400303 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:30:55 crc kubenswrapper[4826]: I0319 20:30:55.400910 4826 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.400223 4826 patch_prober.go:28] interesting pod/machine-config-daemon-zz87p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.400840 4826 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.400900 4826 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.402003 4826 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d"} pod="openshift-machine-config-operator/machine-config-daemon-zz87p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.402096 4826 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" containerName="machine-config-daemon" containerID="cri-o://95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d" gracePeriod=600 Mar 19 20:31:25 crc kubenswrapper[4826]: E0319 20:31:25.532177 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.878624 4826 generic.go:334] "Generic (PLEG): container finished" podID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" 
containerID="95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d" exitCode=0 Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.878694 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" event={"ID":"b456fa3f-c7a7-45ca-b560-e7a9b21be05a","Type":"ContainerDied","Data":"95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d"} Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.879479 4826 scope.go:117] "RemoveContainer" containerID="d046213c26048ba48fa301b15ba9433e2be6da4043ebedf6f1a1793170691e95" Mar 19 20:31:25 crc kubenswrapper[4826]: I0319 20:31:25.880591 4826 scope.go:117] "RemoveContainer" containerID="95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d" Mar 19 20:31:25 crc kubenswrapper[4826]: E0319 20:31:25.881173 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:31:38 crc kubenswrapper[4826]: I0319 20:31:38.975645 4826 scope.go:117] "RemoveContainer" containerID="95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d" Mar 19 20:31:38 crc kubenswrapper[4826]: E0319 20:31:38.977624 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:31:52 crc kubenswrapper[4826]: I0319 
20:31:52.977509 4826 scope.go:117] "RemoveContainer" containerID="95b0b900f4eb8dd11b67aa30345bb619fef585c718e3011755caf9f779d6bf1d" Mar 19 20:31:52 crc kubenswrapper[4826]: E0319 20:31:52.998227 4826 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zz87p_openshift-machine-config-operator(b456fa3f-c7a7-45ca-b560-e7a9b21be05a)\"" pod="openshift-machine-config-operator/machine-config-daemon-zz87p" podUID="b456fa3f-c7a7-45ca-b560-e7a9b21be05a" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.180249 4826 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565872-745wg"] Mar 19 20:32:00 crc kubenswrapper[4826]: E0319 20:32:00.181672 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" containerName="collect-profiles" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.181689 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" containerName="collect-profiles" Mar 19 20:32:00 crc kubenswrapper[4826]: E0319 20:32:00.181758 4826 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f214485-8639-43a0-a7d2-c8ecb99b9b6a" containerName="oc" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.181768 4826 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f214485-8639-43a0-a7d2-c8ecb99b9b6a" containerName="oc" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.182080 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a255471-5ac1-40ec-9ba2-fa5a0cac0d8c" containerName="collect-profiles" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.182101 4826 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f214485-8639-43a0-a7d2-c8ecb99b9b6a" containerName="oc" Mar 19 20:32:00 crc 
kubenswrapper[4826]: I0319 20:32:00.183174 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.186701 4826 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-b27wl" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.186750 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.187769 4826 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.196249 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-745wg"] Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.270751 4826 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9d44\" (UniqueName: \"kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44\") pod \"auto-csr-approver-29565872-745wg\" (UID: \"2e3cbc4e-108e-406f-af10-37bb6ae704e1\") " pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.373937 4826 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9d44\" (UniqueName: \"kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44\") pod \"auto-csr-approver-29565872-745wg\" (UID: \"2e3cbc4e-108e-406f-af10-37bb6ae704e1\") " pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.471004 4826 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9d44\" (UniqueName: \"kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44\") pod 
\"auto-csr-approver-29565872-745wg\" (UID: \"2e3cbc4e-108e-406f-af10-37bb6ae704e1\") " pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:00 crc kubenswrapper[4826]: I0319 20:32:00.541586 4826 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:01 crc kubenswrapper[4826]: I0319 20:32:01.095087 4826 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565872-745wg"] Mar 19 20:32:01 crc kubenswrapper[4826]: I0319 20:32:01.470323 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-745wg" event={"ID":"2e3cbc4e-108e-406f-af10-37bb6ae704e1","Type":"ContainerStarted","Data":"656e9207a4c5e33f9199d6728c31c1af0001953deea4637c13c67f13be34903b"} Mar 19 20:32:03 crc kubenswrapper[4826]: I0319 20:32:03.495795 4826 generic.go:334] "Generic (PLEG): container finished" podID="2e3cbc4e-108e-406f-af10-37bb6ae704e1" containerID="6d24398ec14a3a495d5e233bc035ac4bc6e7582124e2f9342462c4ab39ea1206" exitCode=0 Mar 19 20:32:03 crc kubenswrapper[4826]: I0319 20:32:03.495862 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-745wg" event={"ID":"2e3cbc4e-108e-406f-af10-37bb6ae704e1","Type":"ContainerDied","Data":"6d24398ec14a3a495d5e233bc035ac4bc6e7582124e2f9342462c4ab39ea1206"} Mar 19 20:32:04 crc kubenswrapper[4826]: I0319 20:32:04.989252 4826 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-745wg" Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.014403 4826 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9d44\" (UniqueName: \"kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44\") pod \"2e3cbc4e-108e-406f-af10-37bb6ae704e1\" (UID: \"2e3cbc4e-108e-406f-af10-37bb6ae704e1\") " Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.020999 4826 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44" (OuterVolumeSpecName: "kube-api-access-q9d44") pod "2e3cbc4e-108e-406f-af10-37bb6ae704e1" (UID: "2e3cbc4e-108e-406f-af10-37bb6ae704e1"). InnerVolumeSpecName "kube-api-access-q9d44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.118175 4826 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9d44\" (UniqueName: \"kubernetes.io/projected/2e3cbc4e-108e-406f-af10-37bb6ae704e1-kube-api-access-q9d44\") on node \"crc\" DevicePath \"\"" Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.526827 4826 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565872-745wg" event={"ID":"2e3cbc4e-108e-406f-af10-37bb6ae704e1","Type":"ContainerDied","Data":"656e9207a4c5e33f9199d6728c31c1af0001953deea4637c13c67f13be34903b"} Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.526870 4826 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656e9207a4c5e33f9199d6728c31c1af0001953deea4637c13c67f13be34903b" Mar 19 20:32:05 crc kubenswrapper[4826]: I0319 20:32:05.526902 4826 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565872-745wg"